Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Modelling commonsense properties using pre-trained bi-encoders

Gajbhiye, Amit, Espinosa-Anke, Luis and Schockaert, Steven 2022. Modelling commonsense properties using pre-trained bi-encoders. Presented at: 29th International Conference on Computational Linguistics (COLING), 12-17 October 2022. Proceedings of the 29th International Conference on Computational Linguistics. International Committee on Computational Linguistics, pp. 3971-3983.

PDF - Accepted Post-Print Version


Grasping the commonsense properties of everyday concepts is an important prerequisite to language understanding. While contextualised language models are reportedly capable of predicting such commonsense properties with human-level accuracy, we argue that such results have been inflated because of the high similarity between training and test concepts. This means that models which capture concept similarity can perform well, even if they do not capture any knowledge of the commonsense properties themselves. In settings where there is no overlap between the properties that are considered during training and testing, we find that the empirical performance of standard language models drops dramatically. To address this, we study the possibility of fine-tuning language models to explicitly model concepts and their properties. In particular, we train separate concept and property encoders on two types of readily available data: extracted hyponym-hypernym pairs and generic sentences. Our experimental results show that the resulting encoders allow us to predict commonsense properties with much higher accuracy than is possible by directly fine-tuning language models. We also present experimental results for the related task of unsupervised hypernym discovery.
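The abstract describes a bi-encoder setup: a concept encoder and a property encoder are trained separately, and a (concept, property) pair is scored by comparing the two resulting embeddings. The following is a minimal illustrative sketch of that scoring scheme only; the linear "encoders", toy vectors, and names below are hypothetical stand-ins, not the paper's actual pre-trained language-model encoders.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding dimensionality (illustrative choice)

# Toy input features for two concepts and one property
# (in the paper these would come from pre-trained language models).
concepts = {"banana": rng.normal(size=DIM), "car": rng.normal(size=DIM)}
prop = rng.normal(size=DIM)  # e.g. a vector for the property "is edible"

# Independent weights for the two towers: the encoders share no parameters.
W_concept = rng.normal(size=(DIM, DIM))
W_property = rng.normal(size=(DIM, DIM))

def encode_concept(x):
    """Map a concept's input features into the shared embedding space."""
    return W_concept @ x

def encode_property(x):
    """Map a property's input features into the same shared space."""
    return W_property @ x

def score(concept_vec, property_vec):
    """Sigmoid of the dot product: plausibility that the concept has the property."""
    logit = encode_concept(concept_vec) @ encode_property(property_vec)
    return 1.0 / (1.0 + np.exp(-logit))

for name, vec in concepts.items():
    print(f"{name}: {score(vec, prop):.3f}")  # each score lies in (0, 1)
```

Because the two encoders are decoupled, property embeddings can be computed once and reused to score every candidate concept, which is what makes bi-encoders attractive for large-scale property prediction.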

Item Type: Conference or Workshop Item (Paper)
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: International Committee on Computational Linguistics
Date of First Compliant Deposit: 16 September 2022
Date of Acceptance: 18 August 2022
Last Modified: 30 Nov 2022 08:47
