GaussianArt: Unified Modeling of Geometry and Motion for Articulated Objects

Licheng Shen1,2* Saining Zhang1,2,3* Honghan Li1,2,3* Peiling Yang1,4
Zihao Huang1,5 Zongzheng Zhang1,2 Hao Zhao2,1†
1Beijing Academy of Artificial Intelligence 2AIR, Tsinghua University 3Nanyang Technological University 4Beijing Institute of Technology 5Huazhong University of Science and Technology

3DV 2026

Architecture Comparison

Abstract

Reconstructing articulated objects is essential for building digital twins of interactive environments. However, prior methods typically decouple geometry and motion by first reconstructing object shape in distinct states and then estimating articulation through post-hoc alignment. This separation complicates the reconstruction pipeline and restricts scalability, especially for objects with complex, multi-part articulation. We introduce a unified representation that jointly models geometry and motion using articulated 3D Gaussians. This formulation improves robustness in motion decomposition and supports articulated objects with up to 20 parts, significantly outperforming prior approaches that often struggle beyond 2–3 parts due to brittle initialization. To systematically assess scalability and generalization, we propose MPArt-90, a new benchmark consisting of 90 articulated objects across 20 categories, each with diverse part counts and motion configurations. Extensive experiments show that our method consistently achieves superior accuracy in part-level geometry reconstruction and motion estimation across a broad range of object types. We further demonstrate applicability to downstream tasks such as robotic simulation and human-scene interaction modeling, highlighting the potential of unified articulated representations in scalable physical modeling.
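To make the idea of a unified articulated representation concrete, the sketch below is an illustrative toy, not the authors' implementation: each Gaussian carries a part label, and each part owns a joint (revolute, prismatic, or fixed) that moves the part's Gaussian centers as a shared articulation state varies. All function and field names here (`articulate`, `joints`, `range`) are hypothetical.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix about a unit axis via Rodrigues' formula."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def articulate(means, part_ids, joints, state):
    """Move each Gaussian center by its part's joint at articulation state in [0, 1].

    means:    (N, 3) Gaussian centers
    part_ids: (N,) integer part label per Gaussian
    joints:   one dict per part, e.g.
              {"type": "revolute", "axis": ..., "pivot": ..., "range": max_angle}
              {"type": "prismatic", "axis": ..., "range": max_translation}
              {"type": "fixed"}
    """
    out = means.copy()
    for pid, joint in enumerate(joints):
        mask = part_ids == pid
        if joint["type"] == "revolute":
            # Rotate the part's Gaussians about the joint axis through the pivot.
            R = rodrigues(joint["axis"], state * joint["range"])
            out[mask] = (out[mask] - joint["pivot"]) @ R.T + joint["pivot"]
        elif joint["type"] == "prismatic":
            # Slide the part's Gaussians along the joint axis.
            out[mask] = out[mask] + state * joint["range"] * joint["axis"]
        # "fixed" parts stay put
    return out
```

Because geometry (the Gaussians) and motion (the per-part joints) live in one structure, sweeping `state` from 0 to 1 animates the object directly, and the same parameters can be optimized jointly rather than aligned post hoc.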

Comparisons on MPArt-90



Simulation in IsaacSim



Human-Scene Interaction



BibTeX

@article{shen2025gaussianart,
  title={{GaussianArt}: Unified Modeling of Geometry and Motion for Articulated Objects},
  author={Shen, Licheng and Zhang, Saining and Li, Honghan and Yang, Peilin and Huang, Zihao and Zhang, Zongzheng and Zhao, Hao},
  journal={arXiv preprint arXiv:2508.14891},
  year={2025}
}