Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

BLESS: Benchmarking Large Language Models on Sentence Simplification

Kew, Tannon, Chi, Alison, Vásquez-Rodríguez, Laura, Agrawal, Sweta, Aumiller, Dennis, Alva-Manchego, Fernando and Shardlow, Matthew 2023. BLESS: Benchmarking Large Language Models on Sentence Simplification. Presented at: 2023 Conference on Empirical Methods in Natural Language Processing, 6–10 December 2023. Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing. ACL, pp. 13291–13309. DOI: 10.18653/v1/2023.emnlp-main.821

Full text not available from this repository.

Abstract

We present BLESS, a comprehensive performance benchmark of the most recent state-of-the-art Large Language Models (LLMs) on the task of text simplification (TS). We examine how well off-the-shelf LLMs can solve this challenging task, assessing a total of 44 models, differing in size, architecture, pre-training methods, and accessibility, on three test sets from different domains (Wikipedia, news, and medical) under a few-shot setting. Our analysis considers a suite of automatic metrics, as well as a large-scale quantitative investigation into the types of common edit operations performed by the different models. Furthermore, we perform a manual qualitative analysis on a subset of model outputs to better gauge the quality of the generated simplifications. Our evaluation indicates that the best LLMs, despite not being trained on TS, perform comparably with state-of-the-art TS baselines. Additionally, we find that certain LLMs demonstrate a greater range and diversity of edit operations. Our performance benchmark will be available as a resource for the development of future TS methods and evaluation metrics.
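The abstract describes evaluating off-the-shelf LLMs on sentence simplification in a few-shot setting. As a rough illustration of what such a setup can look like, the sketch below builds a few-shot simplification prompt in Python; the instruction wording, demonstration pairs, and number of in-context examples are hypothetical and are not the ones used in BLESS.

```python
# Illustrative sketch only: the exact prompt template and demonstrations used in
# BLESS are not reproduced here; this shows the general shape of a few-shot
# sentence-simplification prompt for an off-the-shelf LLM.

FEW_SHOT_EXAMPLES = [  # hypothetical complex/simple demonstration pairs
    ("The committee deliberated at considerable length before reaching a verdict.",
     "The committee talked for a long time before deciding."),
    ("Patients exhibiting adverse reactions should discontinue usage immediately.",
     "Stop taking the drug right away if you have bad side effects."),
]

def build_few_shot_prompt(source_sentence: str) -> str:
    """Assemble a simple few-shot prompt asking an LLM to simplify a sentence."""
    parts = ["Simplify the following sentences so they are easier to read.\n"]
    for complex_sent, simple_sent in FEW_SHOT_EXAMPLES:
        parts.append(f"Complex: {complex_sent}\nSimple: {simple_sent}\n")
    # The sentence to be simplified is appended last, leaving the completion open.
    parts.append(f"Complex: {source_sentence}\nSimple:")
    return "\n".join(parts)

if __name__ == "__main__":
    print(build_few_shot_prompt(
        "The municipality authorized the construction of an additional thoroughfare."
    ))
```

The resulting prompt string would then be sent to each candidate model, and the generated simplifications scored with automatic TS metrics and the edit-operation analysis described in the abstract.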

Item Type: Conference or Workshop Item (Paper)
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Publisher: ACL
Last Modified: 16 Feb 2024 16:00
URI: https://orca.cardiff.ac.uk/id/eprint/166103
