Motion Blender Gaussian Splatting for Dynamic Reconstruction

Xinyu Zhang, Haonan Chang, Yuhan Liu, Abdeslam Boularias

Abstract

Gaussian splatting has emerged as a powerful tool for high-fidelity reconstruction of dynamic scenes. However, existing methods rely primarily on implicit motion representations, such as encoding motions into neural networks or per-Gaussian parameters, which makes it difficult to further manipulate the reconstructed motions. This lack of explicit controllability limits existing methods to replaying recorded motions only, hindering wider applications. To address this, we propose Motion Blender Gaussian Splatting (MB-GS), a novel framework that uses motion graphs as an explicit and sparse motion representation. The motion of graph links is propagated to individual Gaussians via dual quaternion skinning, with learnable weight painting functions determining the influence of each link. The motion graphs and 3D Gaussians are jointly optimized from input videos via differentiable rendering. Experiments show that MB-GS achieves state-of-the-art performance on the iPhone dataset while being competitive on HyperNeRF. Additionally, we demonstrate the application potential of our method in generating novel object motions and robot demonstrations through motion editing.
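
To make the skinning step concrete, the following is a minimal NumPy sketch of dual quaternion skinning as used above: each graph link carries a rigid transform encoded as a dual quaternion, and each Gaussian center is moved by the weight-blended transform of its links. This is an illustrative sketch only; the helper names (qmul, rigid_to_dual_quat, dqs_transform_points) are our own and not the paper's code, and it covers just the forward skinning step, whereas MB-GS additionally learns the weights and graph motions through differentiable rendering.

import numpy as np

def qmul(a, b):
    # Hamilton product of quaternions stored as (w, x, y, z).
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rigid_to_dual_quat(q_rot, t):
    # Pack a rotation quaternion and a translation into a dual quaternion
    # (q_real, q_dual), with q_dual = 0.5 * (0, t) * q_real.
    q_real = q_rot / np.linalg.norm(q_rot)
    q_dual = 0.5 * qmul(np.concatenate([[0.0], t]), q_real)
    return q_real, q_dual

def dqs_transform_points(points, link_dqs, weights):
    # Dual quaternion skinning: blend per-link dual quaternions with
    # per-point weights, renormalize, and apply the resulting rigid
    # transform to each point.
    #   points:   (N, 3) Gaussian centers
    #   link_dqs: list of (q_real, q_dual) pairs, one per graph link
    #   weights:  (N, L) skinning weights, each row summing to 1
    out = np.empty_like(points)
    for n, p in enumerate(points):
        # Linear blend of dual quaternions in (real, dual) space.
        b_real = sum(w * dq[0] for w, dq in zip(weights[n], link_dqs))
        b_dual = sum(w * dq[1] for w, dq in zip(weights[n], link_dqs))
        # Renormalize so the real part is a unit quaternion.
        norm = np.linalg.norm(b_real)
        b_real, b_dual = b_real / norm, b_dual / norm
        # Rotate the point: q * (0, p) * conj(q).
        conj = b_real * np.array([1.0, -1.0, -1.0, -1.0])
        rotated = qmul(qmul(b_real, np.concatenate([[0.0], p])), conj)[1:]
        # Recover the translation: vector part of 2 * b_dual * conj(b_real).
        trans = 2.0 * qmul(b_dual, conj)[1:]
        out[n] = rotated + trans
    return out

As a sanity check, a point weighted equally between an identity link and a link translated by 2 along x ends up halfway, at x = 1:

dq_id = rigid_to_dual_quat(np.array([1.0, 0, 0, 0]), np.zeros(3))
dq_tr = rigid_to_dual_quat(np.array([1.0, 0, 0, 0]), np.array([2.0, 0, 0]))
pts = dqs_transform_points(np.array([[0.0, 0.0, 0.0]]),
                           [dq_id, dq_tr], np.array([[0.5, 0.5]]))
print(pts)  # -> [[1. 0. 0.]]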

Method Overview


Novel Pose Animation

Robot Demonstration Synthesis

Learned Motion Graphs and Weight Paintings


Qualitative Comparison

iPhone Dataset

HyperNeRF Dataset

MB-GS achieves state-of-the-art performance on the iPhone dataset while being competitive on HyperNeRF.

BibTeX

@article{zhang2025motion,
    title={Motion Blender Gaussian Splatting for Dynamic Reconstruction},
    author={Zhang, Xinyu and Chang, Haonan and Liu, Yuhan and Boularias, Abdeslam},
    journal={arXiv preprint arXiv:2503.09040},
    year={2025}
}