Articulated objects are prevalent in everyday life, and their 3D representations play crucial roles across various applications. However, achieving both high-fidelity textured surface reconstruction and dynamic generation for articulated objects remains challenging for existing methods. In this paper, we present REArtGS, a novel framework that introduces additional geometric and motion constraints to 3D Gaussian primitives, enabling realistic surface reconstruction and generation for articulated objects. Specifically, given multi-view RGB images of two arbitrary states of an articulated object, we first introduce an unbiased Signed Distance Field (SDF) guidance to regularize Gaussian opacity fields, strengthening geometric constraints and improving surface reconstruction quality. We then establish deformable fields for the 3D Gaussians, constrained by the kinematic structure of the articulated object, achieving unsupervised generation of surface meshes in unseen states. Extensive experiments on both synthetic and real datasets demonstrate that our approach achieves high-quality textured surface reconstruction for given states and high-fidelity surface generation for unseen states. Project site: https://sites.google.com/view/reartgs/home.
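To illustrate the idea of SDF-guided opacity regularization mentioned in the abstract, the following is a minimal sketch, not the exact REArtGS formulation: it maps signed distances to a target opacity via a logistic density (NeuS-style) that peaks on the zero-level surface, and penalizes Gaussian opacities that deviate from this target. The function names and the sharpness parameter `beta` are illustrative assumptions.

```python
import numpy as np

def sdf_to_opacity(sdf, beta=0.05):
    """Map signed distances to a target opacity in [0, 1].

    Illustrative stand-in for SDF guidance (not the paper's exact form):
    a logistic-density profile that equals 1 at sdf = 0 and decays away
    from the surface. `beta` is a hypothetical sharpness parameter.
    """
    s = 1.0 / (1.0 + np.exp(-np.asarray(sdf) / beta))
    return 4.0 * s * (1.0 - s)  # peaks at 1.0 exactly on the surface

def opacity_regularizer(gaussian_opacity, sdf_at_centers, beta=0.05):
    """L1 penalty pulling Gaussian opacities toward the SDF-induced target."""
    target = sdf_to_opacity(sdf_at_centers, beta)
    return float(np.mean(np.abs(np.asarray(gaussian_opacity) - target)))
```

Under this sketch, Gaussians far from the implicit surface are driven toward zero opacity, which is one way an SDF can supply a geometric constraint on an otherwise unstructured set of 3D Gaussian primitives.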