Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Fine-grained attention and feature-sharing generative adversarial networks for single image super-resolution

Yan, Yitong, Liu, Chuangchuang, Chen, Changyou, Sun, Xianfang ORCID: https://orcid.org/0000-0002-6114-0766, Jin, Longcun, Peng, Xinyi and Zhou, Xiang 2022. Fine-grained attention and feature-sharing generative adversarial networks for single image super-resolution. IEEE Transactions on Multimedia 24, pp. 1473-1487. 10.1109/TMM.2021.3065731

Full text not available from this repository.

Abstract

Traditional super-resolution (SR) methods that minimize the mean square error usually produce images with over-smoothed and blurry edges, owing to the lack of high-frequency details. In this paper, we propose two novel techniques within the generative adversarial network framework to encourage the generation of photo-realistic images for image super-resolution. First, instead of producing a single score to discriminate between real and fake images, we propose a variant, called Fine-grained Attention Generative Adversarial Network (FASRGAN), that discriminates each pixel of real and fake images. FASRGAN adopts a U-Net-like network as the discriminator, with two outputs: an image score and an image score map. The score map has the same spatial size as the HR/SR images and serves as fine-grained attention, representing the degree of reconstruction difficulty at each pixel. Second, instead of using separate networks for the generator and the discriminator, we introduce a feature-sharing variant (denoted Fs-SRGAN) that shares features between the generator and the discriminator. The sharing mechanism keeps the model compact while maintaining its expressive power, and thus improves its ability to produce high-quality images. Quantitative and visual comparisons with state-of-the-art methods on benchmark datasets demonstrate the superiority of our methods. We also apply our super-resolution results to object recognition, which further demonstrates the effectiveness of the proposed methods. The code is available at https://github.com/Rainyfish/FASRGAN-and-Fs-SRGAN.
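Illustrative sketch (not part of the ORCA record): the abstract describes a U-Net-like discriminator with two outputs, a whole-image score and a per-pixel score map. The minimal PyTorch module below is one way such a dual-output discriminator could be structured; all layer widths, depths, and names (e.g. UNetDiscriminator, global_head, pixel_head) are assumptions for illustration only, not the authors' implementation, which is available at the GitHub link above.

```python
# Minimal sketch of a U-Net-like discriminator with two outputs: a global
# image score and a per-pixel score map, as described in the abstract.
# Layer sizes and names are illustrative assumptions, NOT the official code
# (see https://github.com/Rainyfish/FASRGAN-and-Fs-SRGAN).
import torch
import torch.nn as nn

class UNetDiscriminator(nn.Module):
    def __init__(self, in_ch: int = 3, base: int = 64):
        super().__init__()
        # Encoder: extract features and downsample twice.
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, base, 3, 1, 1), nn.LeakyReLU(0.2))
        self.enc2 = nn.Sequential(nn.Conv2d(base, base * 2, 4, 2, 1), nn.LeakyReLU(0.2))
        self.enc3 = nn.Sequential(nn.Conv2d(base * 2, base * 4, 4, 2, 1), nn.LeakyReLU(0.2))
        # Global head: one scalar score per image.
        self.global_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(base * 4, 1)
        )
        # Decoder: upsample back to the input resolution via skip connections.
        self.dec2 = nn.Sequential(
            nn.ConvTranspose2d(base * 4, base * 2, 4, 2, 1), nn.LeakyReLU(0.2)
        )
        self.dec1 = nn.Sequential(
            nn.ConvTranspose2d(base * 4, base, 4, 2, 1), nn.LeakyReLU(0.2)
        )
        # Pixel head: one score per pixel, same spatial size as the input.
        self.pixel_head = nn.Conv2d(base * 2, 1, 3, 1, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        e3 = self.enc3(e2)
        score = self.global_head(e3)                              # (N, 1)
        d2 = self.dec2(e3)
        d1 = self.dec1(torch.cat([d2, e2], dim=1))
        score_map = self.pixel_head(torch.cat([d1, e1], dim=1))   # (N, 1, H, W)
        return score, score_map

if __name__ == "__main__":
    d = UNetDiscriminator()
    score, score_map = d(torch.randn(2, 3, 64, 64))  # H, W divisible by 4
    print(score.shape, score_map.shape)  # torch.Size([2, 1]) torch.Size([2, 1, 64, 64])
```

The decoder mirrors the encoder with skip connections so the score map regains the full HR/SR spatial size, matching the abstract's description of a per-pixel reconstruction-difficulty map. The feature-sharing idea of Fs-SRGAN can likewise be sketched by letting the generator and discriminator hold a reference to one common feature extractor; the snippet below is again a hypothetical illustration, not the authors' architecture.

```python
# Hypothetical sketch of feature sharing (Fs-SRGAN): the generator and the
# discriminator reuse the same shallow feature-extractor module, so its
# parameters receive gradients from both networks' losses.
import torch.nn as nn

shared_extractor = nn.Sequential(nn.Conv2d(3, 64, 3, 1, 1), nn.LeakyReLU(0.2))

class SharedFeatureNet(nn.Module):
    """Wraps a task-specific head around the shared extractor."""
    def __init__(self, shared: nn.Module, head: nn.Module):
        super().__init__()
        self.shared, self.head = shared, head

    def forward(self, x):
        return self.head(self.shared(x))

# Both networks point at the *same* extractor object (shared parameters).
generator = SharedFeatureNet(shared_extractor, nn.Conv2d(64, 3, 3, 1, 1))
discriminator = SharedFeatureNet(shared_extractor, nn.Conv2d(64, 1, 3, 1, 1))
```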

Item Type: Article
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: Institute of Electrical and Electronics Engineers
ISSN: 1520-9210
Date of First Compliant Deposit: 25 January 2022
Date of Acceptance: 7 March 2021
Last Modified: 10 Nov 2022 10:28
URI: https://orca.cardiff.ac.uk/id/eprint/146923

Citation Data

Cited 3 times in Scopus.
