Real-time Large-scale Deformation of Gaussian Splatting

 

Lin Gao1,2*, Jie Yang1, Bo-tao Zhang1,2, Jia-mu Sun1,2, Yu-jie Yuan1,2, Hongbo Fu3, Yu-Kun Lai4

 

 

1 Institute of Computing Technology, Chinese Academy of Sciences

 

2 University of Chinese Academy of Sciences

 

3 The Hong Kong University of Science and Technology      

 

4 Cardiff University      

 

* Corresponding author  

 

 

 

Accepted to SIGGRAPH Asia 2024

 

 

 

 

Figure: Given a set of multi-view images of an object (left), we reconstruct the object with the proposed mesh-based Gaussian splatting representation, which comprises 3D Gaussians and an associated mesh. The mesh is adaptively refined along with Gaussian splitting and also serves as an effective regularization. As a result, our method achieves higher-quality novel view synthesis than 3D Gaussian splatting, even for large-scale deformation. Our 3D Gaussian deformation method produces high-quality deformation results in real time, handling large-scale deformations.

 

 

 

Abstract

 

Neural implicit representations, including Neural Distance Fields and Neural Radiance Fields, have demonstrated significant capabilities for reconstructing surfaces with complicated geometry and topology, and generating novel views of a scene. Nevertheless, it is challenging for users to directly deform or manipulate these implicit representations with large deformations in a real-time fashion. Gaussian Splatting (GS) has recently become a promising method with explicit geometry for representing static scenes and facilitating high-quality and real-time synthesis of novel views. However, it cannot be easily deformed due to the use of discrete Gaussians and the lack of explicit topology. To address this, we develop a novel GS-based method that enables interactive deformation. Our key idea is to design an innovative mesh-based GS representation, which is integrated into Gaussian learning and manipulation. 3D Gaussians are defined over an explicit mesh, and the two are bound together: the rendering of 3D Gaussians guides the mesh face split for adaptive refinement, and the mesh face split directs the splitting of 3D Gaussians. Moreover, the explicit mesh constraints help regularize the Gaussian distribution, suppressing poor-quality Gaussians (e.g., misaligned Gaussians and long, narrow Gaussians), thus enhancing visual quality and reducing artifacts during deformation. Based on this representation, we further introduce a large-scale Gaussian deformation technique to enable deformable GS, which alters the parameters of 3D Gaussians according to the manipulation of the associated mesh. Our method benefits from existing mesh deformation datasets for more realistic data-driven Gaussian deformation. Extensive experiments show that our approach achieves high-quality reconstruction and effective deformation, while maintaining promising rendering quality at a high frame rate (65 FPS on average on a single commodity GPU).
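To make the mesh-Gaussian binding described above more concrete, below is a minimal NumPy sketch of one plausible way to propagate a mesh deformation to a bound Gaussian. It is an illustration under our own assumptions (binding each Gaussian to a parent face via barycentric coordinates plus a signed normal offset; the helper names face_frame and deform_gaussian are hypothetical), not the paper's actual formulation.

# Illustrative sketch only (not the authors' implementation): move a Gaussian
# bound to a triangle face from the rest pose to the deformed pose.
import numpy as np

def face_frame(v0, v1, v2):
    # Orthonormal local frame of a triangle: columns are (tangent, bitangent, normal).
    e1 = v1 - v0
    n = np.cross(e1, v2 - v0)
    n = n / np.linalg.norm(n)
    t = e1 / np.linalg.norm(e1)
    b = np.cross(n, t)
    return np.stack([t, b, n], axis=1)

def deform_gaussian(bary, normal_offset, R_gauss, tri_rest, tri_def):
    # bary          : (3,) barycentric coordinates of the Gaussian on its parent face
    # normal_offset : signed distance from the face plane along its normal
    # R_gauss       : (3, 3) rotation of the Gaussian in the rest pose
    # tri_rest      : (3, 3) rest-pose face vertices, one per row
    # tri_def       : (3, 3) deformed face vertices, one per row
    F_rest = face_frame(*tri_rest)
    F_def = face_frame(*tri_def)
    # New mean: barycentric point on the deformed face plus the normal offset.
    mu = bary @ tri_def + normal_offset * F_def[:, 2]
    # New rotation: carry the Gaussian by the relative rotation of the face frame.
    R_new = F_def @ F_rest.T @ R_gauss
    return mu, R_new

The actual method also adapts the remaining Gaussian parameters and drives the mesh with data-driven deformation, as described in the paper; the sketch only shows how position and orientation could follow the parent face.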

 

 

 

 

Video

 

 

 

Paper


PDF

Code


GitHub

BibTex

 

@article{MeshGaussian2024,
    author = {Gao, Lin and Yang, Jie and Zhang, Botao and Sun, Jiamu and Yuan, Yujie and Fu, Hongbo and Lai, Yu-Kun},
    title = {Real-time Large-scale Deformation of Gaussian Splatting},
    journal = {ACM Transactions on Graphics (SIGGRAPH Asia 2024)},
    year = {2024},
}

 

 

Acknowledgments