Cardiff University | Prifysgol Caerdydd ORCA

Towards an integrated evaluation framework for XAI: an experimental study

Zhang, Qiyuan, Hall, Mark, Johansen, Mark, Galetic, Vedran, Grange, Jacques, Quintana-Amate, Santiago, Nottle, Alistair, Jones, Dylan M. and Morgan, Phillip L. 2022. Towards an integrated evaluation framework for XAI: an experimental study. Procedia Computer Science 207, pp. 3884-3893. 10.1016/j.procs.2022.09.450

PDF - Published Version. Available under License Creative Commons Attribution.


The increasing prevalence of opaque, black-box AI has highlighted the need for explanations of its behaviour, for example via explanation artefacts or proxy models. The current paper presents a paradigm for human-grounded experiments to evaluate the relationship between explanation fidelity, human learning performance, understanding and trust in a black-box AI, by manipulating the complexity of an explanatory artefact. Decision trees were used in the current experiment as exemplar interpretable surrogate models, providing explanations that approximate black-box behaviour by means of explanation by simplification. Consistent with our hypotheses: 1) explanatory artefacts brought about better learning, while greater decision tree depths led to greater interpretability of the AI's performance and greater trust in the AI; and 2) explanatory artefacts facilitated learning and task performance even after they were withdrawn. Findings are discussed in terms of the interplay between human understanding, trust and AI system performance, highlighting the simplifying assumption of a monotonic relationship between explanation fidelity and interpretability.
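The "explanation by simplification" approach the abstract describes — fitting decision-tree surrogates of varying depth to a black-box model's outputs, and measuring how faithfully each surrogate reproduces them — can be illustrated with a minimal sketch. This code is not from the paper; the dataset, the random-forest stand-in for the black box, and the chosen depths are all illustrative assumptions, using scikit-learn.

```python
# Sketch: depth-limited decision trees as interpretable surrogates for a
# black-box classifier, with fidelity measured against the black box's
# predictions (not the true labels). All specifics here are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# Opaque "black-box" model whose behaviour we want to explain.
black_box = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
bb_predictions = black_box.predict(X)

# Fit surrogates of increasing depth; fidelity = agreement with the black box.
# Depth here plays the role of manipulated explanation complexity.
fidelity = {}
for depth in (1, 3, 5, None):  # None = unrestricted depth
    surrogate = DecisionTreeClassifier(max_depth=depth, random_state=0)
    surrogate.fit(X, bb_predictions)
    fidelity[depth] = accuracy_score(bb_predictions, surrogate.predict(X))
```

A shallower tree is easier for a person to read but typically agrees with the black box on fewer cases; the paper's point is that this fidelity-interpretability trade-off need not translate monotonically into human understanding or trust.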

Item Type: Article
Date Type: Published Online
Status: Published
Schools: Psychology
Publisher: Elsevier
ISSN: 1877-0509
Funders: Airbus Ltd.
Date of First Compliant Deposit: 20 December 2022
Last Modified: 04 May 2023 15:26
