Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Distilling semantic concept embeddings from contrastively fine-tuned language models

Li, Na, Kteich, Hanane, Bouraoui, Zied and Schockaert, Steven ORCID: https://orcid.org/0000-0002-9256-2881 2023. Distilling semantic concept embeddings from contrastively fine-tuned language models. Presented at: 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, Taipei, Taiwan, 23-27 July 2023. Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval. p. 216. 10.1145/3539618.3591667

PDF - Accepted Post-Print Version (882 kB)

Abstract

Learning vectors that capture the meaning of concepts remains a fundamental challenge. Somewhat surprisingly, perhaps, pre-trained language models have thus far only enabled modest improvements to the quality of such concept embeddings. Current strategies for using language models typically represent a concept by averaging the contextualised representations of its mentions in some corpus. This is potentially sub-optimal for at least two reasons. First, contextualised word vectors have an unusual geometry, which hampers downstream tasks. Second, concept embeddings should capture the semantic properties of concepts, whereas contextualised word vectors are also affected by other factors. To address these issues, we propose two contrastive learning strategies, based on the view that whenever two sentences reveal similar properties, the corresponding contextualised vectors should also be similar. One strategy is fully unsupervised, estimating the properties which are expressed in a sentence from the neighbourhood structure of the contextualised word embeddings. The second strategy instead relies on a distant supervision signal from ConceptNet. Our experimental results show that the resulting vectors substantially outperform existing concept embeddings in predicting the semantic properties of concepts, with the ConceptNet-based strategy achieving the best results. These findings are furthermore confirmed in a clustering task and in the downstream task of ontology completion.
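The two ideas in the abstract can be sketched in a minimal form: the standard baseline represents a concept by averaging the contextualised vectors of its mentions, while the proposed approach fine-tunes with a contrastive objective so that sentences expressing similar properties yield similar vectors. The sketch below uses an InfoNCE-style loss for illustration; the function names and the exact loss formulation are assumptions, not the paper's implementation.

```python
import numpy as np

def concept_embedding(mention_vectors):
    """Baseline from the abstract: average the contextualised
    vectors of a concept's mentions in some corpus."""
    return np.mean(mention_vectors, axis=0)

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style objective (illustrative choice): pull the anchor
    towards a vector from a sentence expressing similar properties
    (the positive) and push it away from the negatives."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    sims = [cos(anchor, positive)] + [cos(anchor, n) for n in negatives]
    logits = np.array(sims) / temperature
    logits -= logits.max()  # numerical stability before exponentiating
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with the positive as target
```

In the unsupervised variant, positives would be chosen from the neighbourhood structure of the contextualised vectors themselves; in the distantly supervised variant, from sentences linked via shared ConceptNet properties.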

Item Type: Conference or Workshop Item (Paper)
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
ISBN: 978-1-4503-9408-6
Date of First Compliant Deposit: 6 May 2023
Date of Acceptance: 5 April 2023
Last Modified: 17 Feb 2025 15:33
URI: https://orca.cardiff.ac.uk/id/eprint/159300

Citation Data

Cited 2 times in Scopus.

