Zhang, Gaifan, Zhou, Yi (ORCID: https://orcid.org/0000-0001-7009-8515) and Bollegala, Danushka. 2024. Evaluating unsupervised dimensionality reduction methods for pretrained sentence embeddings. Presented at: The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), Turin, Italy, 20-25 May 2024. In: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024). Turin, Italy: ELRA and ICCL, pp. 6530-6543.
PDF (Published Version). Available under License Creative Commons Attribution Non-commercial.
Abstract
Sentence embeddings produced by Pretrained Language Models (PLMs) have received wide attention from the NLP community due to their superior performance when representing texts in numerous downstream applications. However, the high dimensionality of the sentence embeddings produced by PLMs is problematic when representing large numbers of sentences in memory- or compute-constrained devices. As a solution, we evaluate unsupervised dimensionality reduction methods to reduce the dimensionality of sentence embeddings produced by PLMs. Our experimental results show that simple methods such as Principal Component Analysis (PCA) can reduce the dimensionality of sentence embeddings by almost 50%, without incurring a significant loss in performance in multiple downstream tasks. Surprisingly, reducing the dimensionality further improves performance over the original high dimensional versions for the sentence embeddings produced by some PLMs in some tasks.
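For context, the following is a minimal sketch of the kind of unsupervised reduction the abstract describes, assuming scikit-learn's PCA and a 768-dimensional encoder output; the embedding dimensionality, the 50% target, and the random stand-in data are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in for sentence embeddings produced by a PLM encoder
# (e.g. a 768-dimensional SentenceTransformer model); random vectors
# are used here so the sketch runs without downloading any model.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 768))  # 1000 sentences, 768 dims

# Fit PCA on the embedding matrix (fully unsupervised) and keep roughly
# half of the original dimensions, as in the abstract's ~50% reduction.
pca = PCA(n_components=384)
reduced = pca.fit_transform(embeddings)

print(embeddings.shape, "->", reduced.shape)  # (1000, 768) -> (1000, 384)
```

In practice the PCA projection would be fit once on a sample of embeddings and then applied to new sentences before they are stored or used in downstream tasks, which is where the memory and compute savings come from.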
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Date Type: | Publication |
| Status: | Published |
| Schools: | Computer Science & Informatics |
| Publisher: | ELRA and ICCL |
| Date of First Compliant Deposit: | 20 November 2024 |
| Date of Acceptance: | 2024 |
| Last Modified: | 26 Nov 2024 16:16 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/173670 |