Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

BERT is to NLP what AlexNet is to CV: can pre-trained language models identify analogies?

Ushio, Asahi, Espinosa-Anke, Luis ORCID: https://orcid.org/0000-0001-6830-9176, Schockaert, Steven ORCID: https://orcid.org/0000-0002-9256-2881 and Camacho Collados, Jose ORCID: https://orcid.org/0000-0003-1618-7239 2021. BERT is to NLP what AlexNet is to CV: can pre-trained language models identify analogies? Presented at: 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Bangkok, Thailand, 1-6 August 2021.

PDF (Accepted Post-Print Version): ACL_2021_Analogies_LMs.pdf (670kB)
Item Type: Conference or Workshop Item (Paper)
Date Type: Completion
Status: Unpublished
Schools: Advanced Research Computing @ Cardiff (ARCCA); Computer Science & Informatics
Date of First Compliant Deposit: 4 June 2021
Last Modified: 14 Jun 2024 15:21
URI: https://orca.cardiff.ac.uk/id/eprint/141729

Citation Data

Cited 6 times in Scopus.

