Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Deep learning for quality assessment of echocardiographic images

Kang, Shuping, Hicks, Yulia ORCID: https://orcid.org/0000-0002-7179-4587 and Setchi, Rossitza ORCID: https://orcid.org/0000-0002-7207-6544 2025. Deep learning for quality assessment of echocardiographic images. Procedia Computer Science 270 , pp. 4917-4926. 10.1016/j.procs.2025.09.618

PDF - Published Version (916kB)
Available under License Creative Commons Attribution Non-commercial No Derivatives.

Abstract

The growing need for standardised and automated cardiac ultrasound (US) acquisition has driven the integration of deep learning into echocardiographic workflows. While existing deep learning (DL) models have shown promising results in tasks such as view classification and image quality assessment, most of these approaches focus either on differentiating among standard views or on grading image quality within a standard view. However, these methods lack the capacity to model the sequential spatial transitions that occur during the acquisition process, limiting their applicability to real-time probe guidance and robotic control. To address this gap, we propose a classification framework designed for the process of acquiring the parasternal long-axis (PLAX) view under a fixed scanning protocol. Based on extensive probe movement experiments across multiple patients, we identified four representative echocardiographic views that appear during the search for the optimal PLAX position. These views correspond to distinct probe positions and orientations and reflect varying levels of image completeness. A dataset of 7,200 annotated images was used to train a ResNet50-based deep network for multi-class classification. The model achieved robust performance, with accuracy, sensitivity, specificity, and F1 scores above 89% and AUC exceeding 97% in patient-level cross-validation. It effectively captures spatially relevant features, distinguishes subtle view differences, and generalises well to unseen data. The outputs provide interpretable feedback correlating image quality with probe position, enabling real-time scanning assessment. Furthermore, this work introduces a novel problem formulation: multi-class view classification under a fixed acquisition protocol. It provides a foundation for developing the next generation of intelligent US systems. By linking image classification to probe position and orientation, the proposed framework enables real-time feedback that can ultimately support autonomous scanning agents in locating diagnostically optimal cardiac views.
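The abstract reports per-class sensitivity and specificity from patient-level cross-validation over four view classes. As an illustration only (not the authors' code), the sketch below shows how such one-vs-rest metrics can be derived from a multi-class confusion matrix; the example matrix values are hypothetical.

```python
def per_class_metrics(cm):
    """One-vs-rest sensitivity and specificity for each class.

    cm[i][j] = number of samples with true class i predicted as class j.
    Returns a list of (sensitivity, specificity) tuples, one per class.
    """
    n = len(cm)
    total = sum(sum(row) for row in cm)
    metrics = []
    for k in range(n):
        tp = cm[k][k]                               # true positives for class k
        fn = sum(cm[k]) - tp                        # class-k samples predicted as other classes
        fp = sum(cm[i][k] for i in range(n)) - tp   # other classes predicted as k
        tn = total - tp - fn - fp                   # everything else
        sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
        specificity = tn / (tn + fp) if (tn + fp) else 0.0
        metrics.append((sensitivity, specificity))
    return metrics

# Hypothetical 4-view confusion matrix (rows = true view, cols = predicted view)
cm = [
    [90,  4,  3,  3],
    [ 5, 88,  4,  3],
    [ 2,  3, 92,  3],
    [ 3,  2,  4, 91],
]
for view, (sens, spec) in enumerate(per_class_metrics(cm)):
    print(f"view {view}: sensitivity={sens:.3f} specificity={spec:.3f}")
```

Patient-level cross-validation additionally requires that all images from one patient fall into the same fold, so that the reported metrics reflect generalisation to unseen patients rather than to unseen frames of already-seen patients.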

Item Type: Article
Date Type: Published Online
Status: Published
Schools: Schools > Engineering
Publisher: Elsevier
ISSN: 1877-0509
Date of First Compliant Deposit: 9 November 2025
Date of Acceptance: 9 November 2025
Last Modified: 10 Nov 2025 10:20
URI: https://orca.cardiff.ac.uk/id/eprint/182237
