Jameel, Mohammad (ORCID: https://orcid.org/0000-0002-3707-4367), Bouraoui, Zied and Schockaert, Steven (ORCID: https://orcid.org/0000-0002-9256-2881) 2018. Unsupervised learning of distributional relation vectors. Presented at: 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Australia, 15-20 July 2018.
PDF - Accepted Post-Print Version (1MB)
Abstract
Word embedding models such as GloVe rely on co-occurrence statistics to learn vector representations of word meaning. While we may similarly expect that co-occurrence statistics can be used to capture rich information about the relationships between different words, existing approaches for modeling such relationships are based on manipulating pre-trained word vectors. In this paper, we introduce a novel method which directly learns relation vectors from co-occurrence statistics. To this end, we first introduce a variant of GloVe, in which there is an explicit connection between word vectors and PMI-weighted co-occurrence vectors. We then show how relation vectors can be naturally embedded into the resulting vector space.
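To make the abstract's idea concrete, the sketch below shows one simple way PPMI-weighted co-occurrence vectors and a pair-specific relation vector could be computed directly from corpus statistics. This is a minimal illustration under assumed design choices (window-based counts, restricting to sentences that mention both words), not the authors' actual model; all function names are hypothetical.

```python
import numpy as np
from collections import Counter

# Illustrative sketch only, NOT the paper's exact formulation.

def cooccurrence_counts(sentences, window=5):
    """Count word/context co-occurrences within a symmetric window."""
    pair_counts, word_counts = Counter(), Counter()
    for sent in sentences:
        for i, w in enumerate(sent):
            word_counts[w] += 1
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    pair_counts[(w, sent[j])] += 1
    return pair_counts, word_counts

def ppmi_vector(word, pair_counts, word_counts, vocab):
    """Positive-PMI weighted co-occurrence vector of `word` over `vocab`."""
    total = sum(word_counts.values())
    vec = np.zeros(len(vocab))
    for k, c in enumerate(vocab):
        n = pair_counts.get((word, c), 0)
        if n > 0:
            pmi = np.log(n * total / (word_counts[word] * word_counts[c]))
            vec[k] = max(pmi, 0.0)  # keep only positive PMI weights
    return vec

def relation_vector(a, b, sentences, vocab, window=5):
    """Vector for the pair (a, b), built only from sentences mentioning both
    words, so it reflects the contexts in which their relationship is expressed."""
    pair_sents = [s for s in sentences if a in s and b in s]
    counts, totals = cooccurrence_counts(pair_sents, window)
    return ppmi_vector(a, counts, totals, vocab) + ppmi_vector(b, counts, totals, vocab)
```

The design point this illustrates is the one stated in the abstract: rather than deriving a relation representation by manipulating two pre-trained word vectors, the relation vector is built directly from co-occurrence statistics of the word pair's shared contexts.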
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Date Type: | Completion |
| Status: | Unpublished |
| Schools: | Advanced Research Computing @ Cardiff (ARCCA); Computer Science & Informatics |
| Date of First Compliant Deposit: | 18 July 2018 |
| Last Modified: | 14 Jun 2024 15:38 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/112687 |
Citation Data
Cited 19 times in Scopus.