Yang, Sheng, Kuang, Zheng-Fei, Cao, Yan-Pei, Lai, Yu-Kun
PDF (Accepted Post-Print Version, 767kB)
Abstract
We present a real-time dense mapping system that uses predicted 2D semantic labels to optimize the geometric quality of the reconstruction. By combining Convolutional Neural Networks (CNNs) for 2D labeling with a Simultaneous Localization and Mapping (SLAM) system for camera trajectory estimation, recent approaches have succeeded in incrementally fusing and labeling 3D scenes. However, the geometric quality of the reconstruction can be further improved by incorporating such semantic prediction results, which existing methods do not sufficiently exploit. In this paper, we propose to use semantic information to improve two crucial modules in the reconstruction pipeline, namely tracking and loop detection, obtaining mutual benefits in geometric reconstruction and semantic recognition. Specifically, for tracking we use a novel probabilistic projective association approach to efficiently pick out candidate correspondences, where the confidence of each correspondence is quantified by combining similarities over all available short-term invariant features. For loop detection, we incorporate these semantic labels into the original Randomized Ferns encoding to generate a more comprehensive representation for retrieving candidate loop frames. Evaluations on a publicly available synthetic dataset show the effectiveness of our approach, which treats such semantic hints as reliable features for achieving higher geometric quality.
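The loop-detection idea in the abstract augments a Randomized Ferns frame encoding with the predicted per-pixel semantic labels. The sketch below is a minimal illustration of that general idea, not the authors' implementation: it assumes each fern performs a single threshold test on a randomly chosen channel of the RGB-D frame or of the label map, and that frames whose codes disagree in few ferns are retrieved as candidate loop frames. The class name `SemanticFernEncoder`, the channel layout, and all parameters are illustrative assumptions.

```python
# Minimal sketch: Randomized Ferns frame encoding extended with a semantic
# label channel for loop-candidate retrieval. Illustrative only; names,
# channel layout, and the one-test-per-fern simplification are assumptions.
import numpy as np

class SemanticFernEncoder:
    """Encode an RGB-D frame plus a per-pixel semantic label map into a
    compact binary code; frames whose codes differ in few ferns are
    treated as candidate loop frames."""

    def __init__(self, num_ferns=500, rng=None):
        rng = rng or np.random.default_rng(0)
        # Each fern tests one random pixel on one random channel:
        # 0-2 = RGB, 3 = depth, 4 = semantic label id (assumed layout).
        self.px = rng.uniform(0.0, 1.0, size=(num_ferns, 2))   # relative coords
        self.channel = rng.integers(0, 5, size=num_ferns)
        self.threshold = rng.uniform(0.0, 1.0, size=num_ferns)

    def encode(self, rgb, depth, labels):
        """rgb in [0, 1], depth normalized to [0, 1], labels as integer ids."""
        h, w = depth.shape
        ys = (self.px[:, 0] * (h - 1)).astype(int)
        xs = (self.px[:, 1] * (w - 1)).astype(int)
        code = np.empty(len(self.channel), dtype=np.uint8)
        for i, (y, x, c, t) in enumerate(zip(ys, xs, self.channel, self.threshold)):
            if c < 3:                       # color test
                v = rgb[y, x, c]
            elif c == 3:                    # depth test
                v = depth[y, x]
            else:                           # semantic test on the label id
                v = labels[y, x] / max(labels.max(), 1)
            code[i] = v > t
        return code

    @staticmethod
    def dissimilarity(code_a, code_b):
        # Fraction of ferns whose binary tests disagree; low values
        # indicate a candidate loop frame for subsequent verification.
        return np.mean(code_a != code_b)
```

In this sketch the per-fern disagreement rate plays the role of the retrieval distance, so adding a semantic channel simply gives the code extra tests that remain stable under viewpoint and lighting changes; candidate frames returned this way would still need geometric verification before closing a loop.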
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Date Type: | Published Online |
| Status: | Published |
| Schools: | Schools > Computer Science & Informatics |
| Publisher: | IEEE |
| ISBN: | 978-1-5386-8176-3 |
| Related URLs: | |
| Date of First Compliant Deposit: | 4 March 2019 |
| Date of Acceptance: | 26 January 2019 |
| Last Modified: | 08 Aug 2025 15:00 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/120190 |
Citation Data
Cited 8 times in Scopus.