Li, Yixiao, Yang, Xiaoyuan, Fu, Jun, Yue, Guanghui and Zhou, Wei 2024. Deep bi-directional attention network for image super-resolution quality assessment. Presented at: IEEE International Conference on Multimedia and Expo (ICME), Niagara Falls, Canada, 15-19 July 2024. 2024 IEEE International Conference on Multimedia and Expo (ICME). IEEE, pp. 1-6. doi: 10.1109/ICME57554.2024.10687430
PDF (Accepted Post-Print Version) - Download (1MB)
Abstract
There is growing interest in efficient quality assessment algorithms for image super-resolution (SR). However, employing deep learning techniques, especially dual-branch algorithms, to automatically evaluate the visual quality of SR images remains challenging. Existing SR image quality assessment (IQA) metrics based on two-stream networks lack interactions between branches. To address this, we propose a novel full-reference IQA (FR-IQA) method for SR images. Specifically, producing SR images and evaluating how close the SR images are to the corresponding high-resolution (HR) references are separate processes. Based on this consideration, we construct a deep Bidirectional Attention Network (BiAtten-Net) that dynamically deepens visual attention to distortions in both processes, which aligns well with the human visual system (HVS). Experiments on public SR quality databases demonstrate the superiority of the proposed BiAtten-Net over state-of-the-art quality assessment methods. In addition, visualization results and an ablation study show the effectiveness of bi-directional attention.
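To make the idea of branch-to-branch interaction concrete, the sketch below shows one possible bidirectional cross-attention block between the SR branch and the HR-reference branch of a two-stream FR-IQA network. This is a minimal illustration, not the authors' released implementation: the module names, feature dimensions, residual connections, and the scaled dot-product formulation are all assumptions made for this example.

```python
# Illustrative sketch only: bidirectional cross-attention between the SR branch
# and the HR-reference branch of a two-stream FR-IQA network. All names and
# design choices here are assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn


class CrossAttention(nn.Module):
    """Lets a query feature map attend to a context feature map."""

    def __init__(self, channels: int):
        super().__init__()
        self.q = nn.Conv2d(channels, channels, kernel_size=1)
        self.k = nn.Conv2d(channels, channels, kernel_size=1)
        self.v = nn.Conv2d(channels, channels, kernel_size=1)
        self.scale = channels ** -0.5

    def forward(self, query_feat: torch.Tensor, context_feat: torch.Tensor) -> torch.Tensor:
        b, c, h, w = query_feat.shape
        q = self.q(query_feat).flatten(2).transpose(1, 2)     # (B, HW, C)
        k = self.k(context_feat).flatten(2)                   # (B, C, HW)
        v = self.v(context_feat).flatten(2).transpose(1, 2)   # (B, HW, C)
        attn = torch.softmax(q @ k * self.scale, dim=-1)      # (B, HW, HW)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)  # back to (B, C, H, W)
        return query_feat + out  # residual keeps the branch's own features


class BiDirectionalAttention(nn.Module):
    """Exchanges attention in both directions between SR and HR feature maps."""

    def __init__(self, channels: int):
        super().__init__()
        self.sr_queries_hr = CrossAttention(channels)  # SR branch attends to HR reference
        self.hr_queries_sr = CrossAttention(channels)  # HR branch attends to SR image

    def forward(self, sr_feat: torch.Tensor, hr_feat: torch.Tensor):
        sr_out = self.sr_queries_hr(sr_feat, hr_feat)
        hr_out = self.hr_queries_sr(hr_feat, sr_feat)
        return sr_out, hr_out


if __name__ == "__main__":
    sr = torch.randn(1, 64, 32, 32)  # features from the SR (distorted) branch
    hr = torch.randn(1, 64, 32, 32)  # features from the HR reference branch
    sr2, hr2 = BiDirectionalAttention(64)(sr, hr)
    print(sr2.shape, hr2.shape)      # both remain (1, 64, 32, 32)
```

In this reading, each branch's features are refined by attending to the other branch, so distortion cues flow in both directions before quality regression; how BiAtten-Net actually wires and stacks such interactions is detailed in the paper itself.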
Item Type: Conference or Workshop Item (Paper)
Date Type: Published Online
Status: Published
Schools: Computer Science & Informatics
Publisher: IEEE
ISBN: 979-8-3503-9015-5
Date of First Compliant Deposit: 23 July 2024
Date of Acceptance: 13 March 2024
Last Modified: 18 Oct 2024 09:53
URI: https://orca.cardiff.ac.uk/id/eprint/170877