Chen, Jiahao and Qin, Yipeng
Abstract
Neural Radiance Field (NeRF) has been widely recognized for its excellence in novel view synthesis and 3D scene reconstruction. However, its effectiveness is inherently tied to the assumption of static scenes, rendering it susceptible to undesirable artifacts when confronted with transient distractors such as moving objects or shadows. In this work, we propose a novel paradigm, namely "Heuristics-Guided Segmentation" (HuGS), which significantly enhances the separation of static scenes from transient distractors by harmoniously combining the strengths of hand-crafted heuristics and state-of-the-art segmentation models, thus significantly transcending the limitations of previous solutions. Furthermore, we delve into the meticulous design of heuristics, introducing a seamless fusion of Structure-from-Motion (SfM)-based heuristics and color residual heuristics, catering to a diverse range of texture profiles. Extensive experiments demonstrate the superiority and robustness of our method in mitigating transient distractors for NeRFs trained in non-static scenes. Project page: https://cnhaox.github.io/NeRF-HuGS/
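The abstract only names the two heuristics at a high level. As a rough illustration of the general idea behind a color residual heuristic (not the paper's actual implementation), one could threshold the per-pixel color error between a partially trained NeRF rendering and the observed image, treating low-residual pixels as static. The function name `color_residual_mask` and the `threshold` value below are assumptions made for this sketch.

```python
import numpy as np

def color_residual_mask(rendered_rgb, observed_rgb, threshold=0.1):
    """Illustrative sketch (not the paper's method): mark pixels as static
    where the color residual between a NeRF rendering and the observed
    image is small; large residuals suggest transient distractors.

    rendered_rgb, observed_rgb: float arrays of shape (H, W, 3) in [0, 1].
    threshold: assumed residual cutoff for this example.
    """
    residual = np.linalg.norm(rendered_rgb - observed_rgb, axis=-1)  # (H, W)
    return residual < threshold  # True = treated as static

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    observed = rng.random((4, 4, 3))
    rendered = observed.copy()
    rendered[0, 0] += 0.5  # simulate a transient distractor at one pixel
    print(color_residual_mask(np.clip(rendered, 0.0, 1.0), observed))
```

In the paper's pipeline such heuristic cues are only a guide; they are combined with a segmentation model to obtain the final static/transient separation.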
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Date Type: | Published Online |
| Status: | Published |
| Schools: | Schools > Computer Science & Informatics |
| Publisher: | IEEE |
| ISBN: | 979-8-3503-5301-3 |
| ISSN: | 1063-6919 |
| Date of First Compliant Deposit: | 9 April 2024 |
| Date of Acceptance: | 27 February 2024 |
| Last Modified: | 08 Apr 2025 14:01 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/167524 |