Bouraoui, Zied, Camacho Collados, Jose ORCID: https://orcid.org/0000-0003-1618-7239 and Schockaert, Steven ORCID: https://orcid.org/0000-0002-9256-2881 2020. Inducing relational knowledge from BERT. Presented at: Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), New York, NY, USA, 7-12 February 2020. Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34 (5), pp. 7456-7463. 10.1609/aaai.v34i05.6242
Abstract
One of the most remarkable properties of word embeddings is the fact that they capture certain types of semantic and syntactic relationships. Recently, pre-trained language models such as BERT have achieved groundbreaking results across a wide range of Natural Language Processing tasks. However, it is unclear to what extent such models capture relational knowledge beyond what is already captured by standard word embeddings. To explore this question, we propose a methodology for distilling relational knowledge from a pre-trained language model. Starting from a few seed instances of a given relation, we first use a large text corpus to find sentences that are likely to express this relation. We then use a subset of these extracted sentences as templates. Finally, we fine-tune a language model to predict whether a given word pair is likely to be an instance of some relation, when given an instantiated template for that relation as input.
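To make the final step of the described methodology concrete, the following is a minimal sketch of how a BERT classifier could be asked whether a given word pair is likely to be an instance of a relation, given an instantiated template as input. The template format, the `score_pair` helper, and the choice of `bert-base-uncased` are illustrative assumptions rather than the paper's exact setup, and the model would first need to be fine-tuned on positive and negative pairs for its scores to be meaningful.

```python
# Illustrative sketch (not the paper's exact setup): score whether a word
# pair instantiates a relation by feeding an instantiated template to a
# BERT sequence classifier. Assumes prior fine-tuning on labelled pairs.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # binary: instance vs. not an instance
)
model.eval()

def score_pair(template: str, head: str, tail: str) -> float:
    """Return P(pair is a relation instance) for an instantiated template."""
    sentence = template.format(X=head, Y=tail)  # fill the template slots
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability assigned to the "is an instance" class (index 1 by assumption).
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Hypothetical usage with a template mined from the corpus:
print(score_pair("{X} is the capital of {Y}.", "paris", "france"))
```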
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Date Type: | Publication |
| Status: | Published |
| Schools: | Computer Science & Informatics |
| ISBN: | 9781577358350 |
| ISSN: | 2159-5399 |
| Date of First Compliant Deposit: | 26 March 2020 |
| Date of Acceptance: | 12 February 2020 |
| Last Modified: | 25 Nov 2022 11:58 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/127433 |