Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

When customers know It’s AI: Experimental comparison of human and LLM-based communication in service recovery

Hao, Xinyue, Dong, Dapeng, Zhang, Yuxing and Demir, Emrah ORCID: https://orcid.org/0000-0002-4726-2556 2025. When customers know It’s AI: Experimental comparison of human and LLM-based communication in service recovery. Journal of Marketing Communications 10.1080/13527266.2025.2540376
Item availability restricted.

PDF (Accepted Post-Print Version) - Restricted to repository staff only. Available under License Creative Commons Attribution Non-commercial No Derivatives.

Abstract

As generative AI (GAI) becomes increasingly integrated into customer service platforms, its ability to simulate human language raises new relational expectations, particularly in emotionally sensitive interactions. This study investigates how emotional intensity and identity disclosure shape user perceptions of GAI-authored service recovery messages. In a controlled experiment within the online food delivery context, participants evaluated identical service responses across two emotional conditions (routine vs. emotionally charged) and two identity conditions (AI vs. human, disclosed vs. undisclosed). Results reveal that while GAI is perceived as competent in low-emotion scenarios, its human-like language triggers negative reactions under high-emotion conditions, especially after its identity is disclosed. Users interpret simulated empathy as inauthentic, leading to what we term identity-contingent trust violations. Furthermore, participants with higher GAI familiarity were more critical, demonstrating a pattern of critical familiarity, where technical literacy heightens relational expectations. This study advances theories of human–AI interaction by integrating emotional context and identity perception into models of trust calibration. Practically, it highlights the need for role-appropriate GAI deployment and emotionally aware interaction design, where AI systems are matched to context-sensitive tasks and clearly framed as assistants, not surrogates, in situations requiring genuine emotional care.

Item Type: Article
Status: In Press
Schools: Business (Including Economics)
Publisher: Taylor and Francis Group
ISSN: 1466-4445
Date of First Compliant Deposit: 25 July 2025
Date of Acceptance: 24 July 2025
Last Modified: 28 Jul 2025 11:45
URI: https://orca.cardiff.ac.uk/id/eprint/180047
