Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Deep edge map guided depth super resolution

Jiang, Zhongyu, Yue, Huanjing, Lai, Yu-Kun, Yang, Jingyu, Hou, Yonghong and Hou, Chunping 2021. Deep edge map guided depth super resolution. Signal Processing: Image Communication 90, 116040. 10.1016/j.image.2020.116040

PDF - Accepted Post-Print Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.



Accurate edge reconstruction is critical for depth map super resolution (SR), so many traditional SR methods use edge maps to guide depth SR. However, it is difficult to predict accurate edge maps from low-resolution (LR) depth maps. In this paper, we propose a deep edge map guided depth SR method, which consists of an edge prediction subnetwork and an SR subnetwork. The edge prediction subnetwork exploits the hierarchical representations of color and depth images to produce accurate edge maps, which improve the performance of the SR subnetwork. The SR subnetwork is a disentangling cascaded network that progressively upsamples the SR result, where every level is made up of a weight-sharing module and an adaptive module. The weight-sharing module extracts features that are general across levels, while the adaptive module transfers these general features into specific features that adapt to differently degraded inputs. Quantitative and qualitative evaluations on various datasets with different magnification factors demonstrate the effectiveness and promising performance of the proposed method. In addition, we construct a benchmark dataset captured with a Kinect v2 to facilitate research on real-world depth map SR.
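As a rough illustration of the cascade described in the abstract — not the paper's learned network — the following NumPy sketch progressively upsamples a depth map level by level, reusing one fixed kernel at every level (standing in for the weight-sharing module) and applying a hypothetical per-level gain modulated by that level's edge map (standing in for the adaptive module). All names, kernels, and gain values here are illustrative assumptions.

```python
import numpy as np

# Hypothetical fixed parameters standing in for the learned modules:
# the same 3x3 kernel is reused at every level (weight sharing), while
# a per-level scalar gain plays the role of the adaptive module.
SHARED_KERNEL = np.full((3, 3), 1.0 / 9.0)  # shared smoothing kernel
ADAPTIVE_GAIN = [0.5, 0.25]                 # one gain per cascade level

def upsample2x(x):
    """Nearest-neighbour 2x upsampling of a 2-D depth map."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def conv3x3(x, kernel):
    """'Same' 3x3 correlation with zero padding."""
    p = np.pad(x, 1)
    out = np.zeros_like(x)
    h, w = x.shape
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + h, j:j + w]
    return out

def cascade_sr(depth_lr, edge_maps):
    """Upsample depth_lr by 2x per level, refining each level with
    shared features modulated by the edge map at that scale."""
    x = depth_lr
    for level, edges in enumerate(edge_maps):
        x = upsample2x(x)                            # move to next scale
        feat = conv3x3(x, SHARED_KERNEL)             # weight-sharing module
        x = x + ADAPTIVE_GAIN[level] * edges * feat  # adaptive, edge-guided
    return x

# Example: 4x SR of an 8x8 LR depth map with (dummy) all-zero edge maps,
# so the refinement term vanishes and only the upsampling path acts.
lr = np.ones((8, 8))
edges = [np.zeros((16, 16)), np.zeros((32, 32))]
hr = cascade_sr(lr, edges)
```

In the actual method both subnetworks are trained CNNs and the edge maps come from the edge prediction subnetwork; this sketch only shows the dataflow of the progressive, edge-guided cascade.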

Item Type: Article
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: Elsevier
ISSN: 0923-5965
Date of First Compliant Deposit: 4 November 2020
Date of Acceptance: 13 October 2020
Last Modified: 07 Nov 2023 01:45

Citation Data

Cited 7 times in Scopus.


