Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Deep feature importance awareness based no-reference image quality prediction

Yang, Xiaohan, Li, Fan and Liu, Hantao (ORCID: https://orcid.org/0000-0003-4544-3481) 2020. Deep feature importance awareness based no-reference image quality prediction. Neurocomputing 401, pp. 209-223. DOI: 10.1016/j.neucom.2020.03.072

Full text not available from this repository.

Abstract

Deep-learning-based image quality assessment (IQA) algorithms typically rely on transfer learning, adapting a network pre-trained on a classification task to the IQA task. Although this mitigates the shortage of IQA training data to some extent, it cannot distinguish deep features that are important to the IQA task from those that are not, which can degrade prediction accuracy. In this paper, we propose a no-reference IQA method based on modelling deep feature importance. An SE-VGG network is developed using an adaptive transfer learning method. It suppresses features of local parts of salient objects that are unimportant to the IQA task, and emphasizes features of image distortion and of salient objects that are important to it. Moreover, the structure of the SE-VGG is investigated to improve the accuracy of image quality assessment on small IQA databases. Experiments evaluate the proposed method on several databases, including LIVE, TID2013, CSIQ, LIVE Multiply Distorted and LIVE Challenge. The results show that the proposed method significantly outperforms state-of-the-art methods and demonstrates strong generalization ability.
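
The record gives no implementation details beyond the abstract, but the following is a minimal sketch of the kind of architecture it describes: a VGG backbone transferred from ImageNet classification, with squeeze-and-excitation (SE) channel reweighting (Hu et al., 2018) so that channels carrying distortion- and saliency-related features can be emphasized and others suppressed, followed by a regression head that predicts a quality score. The class names, SE placement, reduction ratio and head sizes below are illustrative assumptions, not the authors' specification.

# Hedged sketch only: the paper's exact SE-VGG design, adaptive transfer
# learning procedure and training schedule are not given on this page.
import torch
import torch.nn as nn
from torchvision.models import vgg16

class SEBlock(nn.Module):
    """Squeeze-and-excitation: learn per-channel weights so features
    important to IQA are emphasized and unimportant ones suppressed."""
    def __init__(self, channels: int, reduction: int = 16):  # reduction ratio assumed
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))           # squeeze: global average pool -> (b, c)
        w = self.fc(w).view(b, c, 1, 1)  # excitation: channel weights in (0, 1)
        return x * w                     # scale: channel-wise reweighting

class SEVGG(nn.Module):
    """Pre-trained VGG-16 features + SE reweighting + quality regressor."""
    def __init__(self):
        super().__init__()
        self.features = vgg16(weights="IMAGENET1K_V1").features  # transferred backbone
        self.se = SEBlock(512)            # placement after the last conv stage assumed
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(512, 128),
                                  nn.ReLU(inplace=True), nn.Linear(128, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.pool(self.se(self.features(x))))  # scalar quality score

model = SEVGG()
score = model(torch.randn(1, 3, 224, 224))  # one RGB image -> predicted quality

In such a design, the SE weights are what let the network discriminate between important and unimportant deep features for IQA, which is the gap in plain transfer learning that the abstract identifies.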

Item Type: Article
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: Elsevier
ISSN: 0925-2312
Date of Acceptance: 25 March 2020
Last Modified: 07 Nov 2022 10:24
URI: https://orca.cardiff.ac.uk/id/eprint/132136

Citation Data

Cited 16 times in Scopus.
