1. What
A point-based approach with a novel Dual-Domain Deformation Model for dynamic scene reconstruction.
Contributions:
- Gaussian-Flow, a novel point-based differentiable rendering approach for dynamic 3D scene reconstruction, setting a new state of the art in training speed, rendering FPS, and novel view synthesis quality for 4D scene reconstruction.
- A Dual-Domain Deformation Model for efficient 4D scene training and rendering, which preserves a running speed on par with the original 3DGS with minimal overhead.
- Can be used for downstream tasks
2. Why
2.1 Introduction
- High-fidelity real-time rendering remains a challenge for NeRF-based methods.
- 3DGS has been extended to 4D tasks, but existing extensions significantly lower the rendering speed of the original 3DGS.
2.2 Related work
Notable work:
- Dynamic Neural Radiance Field: dynamic neural scene flow methods have been proposed [27, 30].
- Accelerated Neural Radiance Field
- Differentiable Point-based Rendering: PointRF [41], DSS [39], and 3D Gaussian Splatting (3DGS) [13].
3. How
3.1 Dual-Domain Deformation Model
Assume that only the rotation $q$, radiance $c$, and position $\mu$ of a 3D Gaussian particle change over time, while the scaling $s$ and opacity $\alpha$ remain constant.
Then, a time-dependent attribute residual $D(t)$ is used to model the difference between the base attribute $S_0 \in \{\mu_0, c_0, q_0\}$ and the attribute at time $t$:

$$S(t) = S_0 + D(t),$$
where $D(t) = P_N(t) + F_L(t)$ combines a degree-$N$ polynomial fit in the time domain with an order-$L$ Fourier series in the frequency domain.
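
Below is a minimal PyTorch sketch of how such a dual-domain residual could be evaluated for a single attribute. The function name `dual_domain_residual`, the coefficient tensor shapes, and the chosen polynomial degree / number of frequencies are illustrative assumptions, not the paper's exact parameterization.

```python
import torch

def dual_domain_residual(t, poly_coeffs, fourier_coeffs):
    """Sketch of the per-attribute residual D(t) = P_N(t) + F_L(t).

    t              : scalar time, normalized to [0, 1]
    poly_coeffs    : (N, dim) tensor, time-domain polynomial coefficients (degree N)
    fourier_coeffs : (L, 2, dim) tensor, frequency-domain sin/cos coefficients (L frequencies)
    """
    N = poly_coeffs.shape[0]
    L = fourier_coeffs.shape[0]

    # Time-domain polynomial P_N(t) = sum_n a_n * t^n  (n = 1..N)
    powers = torch.stack([t ** n for n in range(1, N + 1)])            # (N,)
    poly_term = (powers[:, None] * poly_coeffs).sum(dim=0)             # (dim,)

    # Frequency-domain Fourier series F_L(t) = sum_l b_l sin(2*pi*l*t) + c_l cos(2*pi*l*t)
    freqs = torch.arange(1, L + 1, dtype=t.dtype) * 2 * torch.pi * t   # (L,)
    basis = torch.stack([torch.sin(freqs), torch.cos(freqs)], dim=1)   # (L, 2)
    fourier_term = (basis[..., None] * fourier_coeffs).sum(dim=(0, 1)) # (dim,)

    return poly_term + fourier_term


# Example: deform the position of one Gaussian at time t (all values are placeholders)
mu_0 = torch.zeros(3)                         # base position mu_0 at t = 0
poly_coeffs = torch.randn(3, 3) * 0.01        # learnable, degree-3 polynomial
fourier_coeffs = torch.randn(4, 2, 3) * 0.01  # learnable, 4 Fourier frequencies
t = torch.tensor(0.5)
mu_t = mu_0 + dual_domain_residual(t, poly_coeffs, fourier_coeffs)    # S(t) = S_0 + D(t)
```

In this sketch, the polynomial and Fourier coefficients would be stored per Gaussian and optimized jointly with the base attributes, so evaluating $S(t)$ adds only a small amount of arithmetic on top of the standard 3DGS rasterization.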