Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Unsupervised Evaluation of Human Translation Quality

Bollegala, Danushka and Zhou, Yi (ORCID: https://orcid.org/0000-0001-7009-8515) 2019. Unsupervised Evaluation of Human Translation Quality. Presented at: 11th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K), Vienna, Austria, 17-19 September 2019. Published in: Fred, Ana and Filipe, Joaquim eds. Proceedings of the 11th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K), vol. 1. SciTePress, pp. 55-64. DOI: 10.5220/0008064500550064

Full text not available from this repository.

Abstract

Even though machine translation (MT) systems have achieved impressive performance on cross-lingual translation tasks, the quality of MT still falls far behind that of professional human translations (HTs), owing to the complexity of natural language and, in particular, domain-specific terminology. HTs therefore remain in wide demand in practice. However, the quality of HT is also imperfect and varies significantly with the experience and knowledge of the translator. Evaluating HT quality automatically poses many challenges: although bilingual speakers can assess translation quality, manually checking the accuracy of translations is expensive and time-consuming. In this paper, we propose an unsupervised method to evaluate the quality of HT without requiring any labelled data. We compare a range of methods for automatically grading HTs and observe that the Bidirectional Minimum Word Mover's Distance (BiMWMD) produces gradings that correlate well with human judgements.
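
For orientation, BiMWMD builds on the Word Mover's Distance family of measures, which score how far the word embeddings of one sentence must be moved to cover those of another. The Python sketch below is purely illustrative and is not the authors' implementation: it assumes cross-lingual word embeddings already mapped into a shared space, approximates the "minimum" variant by matching each word to its nearest counterpart in the other sentence (in the spirit of Kusner et al.'s relaxed WMD lower bound), and symmetrises over both directions for the "bidirectional" part. The names min_wmd and bi_min_wmd are hypothetical.

import numpy as np

def min_wmd(src_vecs: np.ndarray, tgt_vecs: np.ndarray) -> float:
    # Pairwise Euclidean distances between every source and target embedding.
    d = np.linalg.norm(src_vecs[:, None, :] - tgt_vecs[None, :, :], axis=-1)
    # "Minimum" relaxation: transport each source word wholly to its nearest
    # target word, then average the per-word costs.
    return float(d.min(axis=1).mean())

def bi_min_wmd(src_vecs: np.ndarray, tgt_vecs: np.ndarray) -> float:
    # Bidirectional: average the source-to-target and target-to-source costs
    # so that neither sentence's words can be ignored.
    return 0.5 * (min_wmd(src_vecs, tgt_vecs) + min_wmd(tgt_vecs, src_vecs))

# Toy usage with random vectors standing in for cross-lingual embeddings.
rng = np.random.default_rng(0)
src = rng.normal(size=(5, 300))   # 5 source-language word vectors
tgt = rng.normal(size=(7, 300))   # 7 target-language word vectors
print(bi_min_wmd(src, tgt))

Under these assumptions a lower score means the translation's embedding cloud sits closer to the source's, which is the sense in which such a distance can be read as an unsupervised quality grade.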

Item Type: Conference or Workshop Item (Paper)
Date Type: Published Online
Status: Published
Schools: Computer Science & Informatics
Publisher: SciTePress
ISBN: 9789897583827
Last Modified: 30 Jul 2024 12:00
URI: https://orca.cardiff.ac.uk/id/eprint/170402
