Chatterjee, Usashi, Gajbhiye, Amit and Schockaert, Steven (ORCID: https://orcid.org/0000-0002-9256-2881) 2023. Cabbage sweeter than cake? Analysing the potential of large language models for learning conceptual spaces. Presented at: Conference on Empirical Methods in Natural Language Processing (EMNLP), Singapore, 6-10 December 2023. Published in: Bouamor, Houda, Pino, Juan and Bali, Kalika eds. Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, pp. 11836-11842. 10.18653/v1/2023.emnlp-main.725
Abstract
The theory of Conceptual Spaces is an influential cognitive-linguistic framework for representing the meaning of concepts. Conceptual spaces are constructed from a set of quality dimensions, which essentially correspond to primitive perceptual features (e.g. hue or size). These quality dimensions are usually learned from human judgements, which means that applications of conceptual spaces tend to be limited to narrow domains (e.g. modelling colour or taste). Encouraged by recent findings about the ability of Large Language Models (LLMs) to learn perceptually grounded representations, we explore the potential of such models for learning conceptual spaces. Our experiments show that LLMs can indeed be used for learning meaningful representations to some extent. However, we also find that fine-tuned models of the BERT family are able to match or even outperform the largest GPT-3 model, despite being 2 to 3 orders of magnitude smaller.
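As a rough illustration of the kind of probing the abstract describes (not the paper's actual protocol), one can ask a masked language model from the BERT family how strongly it associates a concept with a quality dimension such as sweetness. The model name, prompt template, and the `sweetness_score` helper below are all illustrative assumptions.

```python
# Hypothetical sketch: probe a BERT-style masked language model for a
# "sweetness" quality dimension by filling a comparative template and
# reading off the probability assigned to the word "sweet".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def sweetness_score(concept: str) -> float:
    """Return the model's probability for 'sweet' in a taste template."""
    prompt = f"in terms of taste, {concept} is very [MASK]."
    # Restrict the fill-mask predictions to the single target word.
    result = fill_mask(prompt, targets=["sweet"])
    return result[0]["score"]

# Rank a few concepts along the assumed sweetness dimension.
for concept in ["cake", "cabbage", "lemon"]:
    print(concept, round(sweetness_score(concept), 4))
```

Comparing such scores across concepts gives a relative ordering along the dimension, which is the kind of judgement the paper evaluates LLMs on.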
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Status: | Published |
| Schools: | Advanced Research Computing @ Cardiff (ARCCA); Computer Science & Informatics |
| Publisher: | Association for Computational Linguistics |
| Date of Acceptance: | 7 October 2023 |
| Last Modified: | 10 Jun 2024 09:03 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/165645 |