Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Robust knowledge extraction from large language models using social choice theory

Potyka, Nico, Zhu, Yuqicheng, He, Yunjie, Kharlamov, Evgeny and Staab, Steffen 2024. Robust knowledge extraction from large language models using social choice theory. Presented at: The 23rd International Conference on Autonomous Agents and Multi-Agent Systems (AAMAS 2024), Auckland, New Zealand, 6-10 May 2024. Published in: Alechina, N., Dignum, V. and Sichman, J. S. eds. AAMAS '24: Proceedings of the 23rd International Conference on Autonomous Agents and Multiagent Systems. Association for Computing Machinery, 1593–1601. 10.5555/3635637.3663020

PDF - Published Version
Available under License Creative Commons Attribution.

Abstract

Large language models (LLMs) can support a wide range of applications such as conversational agents, creative writing, and general query answering. However, they are ill-suited for query answering in high-stakes domains like medicine because they are typically not robust: even the same query can yield different answers when prompted multiple times. To improve the robustness of LLM queries, we propose posing ranking queries repeatedly and aggregating the results using methods from social choice theory. We study ranking queries in diagnostic settings such as medical and fault diagnosis and discuss how the Partial Borda Choice function from the literature can be applied to merge multiple query results. We discuss some additional interesting properties in our setting and evaluate the robustness of our approach empirically.
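The aggregation idea can be illustrated with a minimal sketch of classic Borda-count aggregation over complete rankings. This is a simplification: the paper applies the Partial Borda Choice function, which also handles partial rankings, and the function and variable names below are hypothetical, not taken from the paper.

```python
from collections import defaultdict

def borda_aggregate(rankings):
    """Aggregate several rankings of the same candidates by Borda score.

    Each ranking is a list ordered from most to least preferred; a
    candidate in position i of a ranking over n candidates earns
    n - 1 - i points. Higher total score means higher aggregate rank.
    """
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, candidate in enumerate(ranking):
            scores[candidate] += n - 1 - position
    # Sort by descending total score, breaking ties alphabetically.
    return sorted(scores, key=lambda c: (-scores[c], c))

# Three noisy "repeated LLM answers" ranking candidate diagnoses:
answers = [
    ["flu", "cold", "allergy"],
    ["flu", "allergy", "cold"],
    ["cold", "flu", "allergy"],
]
print(borda_aggregate(answers))  # → ['flu', 'cold', 'allergy']
```

Even though the individual answers disagree, the aggregated ranking is stable: "flu" wins with 5 points against 3 for "cold" and 1 for "allergy", which mirrors how repeated, non-robust queries can be merged into a more robust overall answer.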

Item Type: Conference or Workshop Item (Paper)
Date Type: Published Online
Status: Published
Schools: Computer Science & Informatics
Publisher: Association for Computing Machinery
Related URLs:
Date of First Compliant Deposit: 15 May 2024
Date of Acceptance: 20 December 2023
Last Modified: 12 Jun 2024 09:01
URI: https://orca.cardiff.ac.uk/id/eprint/168930
