Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

AI failure in the Google Gemini Controversy: synthetic history and epistemic disobedience

Nieto McAvoy, Eva and Kidd, Jenny (ORCID: https://orcid.org/0000-0003-0188-2140) 2025. AI failure in the Google Gemini Controversy: synthetic history and epistemic disobedience. Annals of the Fondazione Luigi Einaudi.
Item availability restricted.

PDF - Accepted Post-Print Version (244kB)
Restricted to repository staff only

PDF (Provisional file) - Accepted Post-Print Version (17kB)

Abstract

In February 2024, images of black Vikings and female popes produced with Google’s generative AI Gemini began circulating on social media. The narrative around them was that fairly ‘neutral’ prompts such as ‘founding fathers’ delivered inaccurate, racially diverse images. Responses on social media and in news outlets ranged from indignant to enthusiastic: Gemini was both accused of trying to rewrite history and praised for challenging algorithmic biases. As a result of this ‘error’, Gemini’s image generator was suspended while Google worked on this failure of alignment and apologised for overcorrecting for long-standing racial bias problems in AI. In this article, we explore this media event as an example of the myriad ways in which AI systems intersect with cultural norms, values, and belief systems. We suggest that while this ‘failure’ is representative of the cultural limitations of AI knowledge, it can also be understood as an act of ‘epistemic disobedience’ in ways that might be more productive than myopic. To explore this issue, we present findings from a detailed content and discourse analysis of international reporting on the Gemini incident, reframing it as a site of epistemic and cultural negotiations rather than a mere technical failure. Studying the controversy at the intersection of memory studies, critical algorithmic studies, and media studies opens new avenues for understanding how generative AI reshapes historical representation. We conclude that this epistemic failure/disobedience highlights the fact that our past has always been synthetic.

Item Type: Article
Status: In Press
Schools: Schools > Journalism, Media and Culture
Date of First Compliant Deposit: 18 November 2025
Date of Acceptance: 30 October 2025
Last Modified: 19 Nov 2025 12:15
URI: https://orca.cardiff.ac.uk/id/eprint/182487
