Abstract

Post-editing machine translation has become more common in recent years due to the growing volume of material requiring translation and the increasing effectiveness of machine translation systems. This project presents a system for formalizing structured translation specifications that facilitates assessing a post-editor's performance. This report provides details on two software applications: the Ruqual Specifications Writer, which aids in authoring post-editing project specifications, and the Ruqual Rubric Viewer, which provides a graphical user interface for filling out a machine-readable rubric file. The project as a whole relies on a definition of translation quality based on the specification approach. To test whether potential evaluators can reliably assess the quality of post-edited translations, a user study was conducted using the Specifications Writer and Rubric Viewer. The specifications developed for the project were based on actual post-editing data provided by Ray Flournoy of Adobe. The study simulated the work of five post-editors, for whom texts and scenarios were developed. Seventeen non-expert graders rated the work of the five fictional post-editors, and an intraclass correlation analysis of the graders' responses shows that their ratings are highly reliable. The groundwork laid by this project should support the development of other applications that assist in assessing translation projects in terms of a specification approach to quality.
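
The abstract does not state which form of intraclass correlation was used for the reliability analysis. The sketch below illustrates one plausible choice, a two-way random-effects, average-measures ICC (ICC(2,k) in Shrout and Fleiss terms), written in Java since the Ruqual tools are Java applications. The class name IccSketch, the method iccTwoWayAverage, and the toy rating matrix are illustrative assumptions, not part of the project.

import java.util.Arrays;

public class IccSketch {

    // ratings[subject][rater]: each post-editor work sample rated by each grader.
    // Sketch only; assumes a complete matrix with no missing ratings.
    static double iccTwoWayAverage(double[][] ratings) {
        int n = ratings.length;        // subjects (post-editor work samples)
        int k = ratings[0].length;     // raters (graders)

        double grand = Arrays.stream(ratings)
                .flatMapToDouble(Arrays::stream).average().orElse(0);

        double[] rowMean = new double[n];
        double[] colMean = new double[k];
        for (int i = 0; i < n; i++) {
            rowMean[i] = Arrays.stream(ratings[i]).average().orElse(0);
        }
        for (int j = 0; j < k; j++) {
            for (int i = 0; i < n; i++) colMean[j] += ratings[i][j];
            colMean[j] /= n;
        }

        // ANOVA sums of squares for the two-way layout.
        double ssr = 0, ssc = 0, sse = 0;
        for (int i = 0; i < n; i++) ssr += Math.pow(rowMean[i] - grand, 2);
        ssr *= k;
        for (int j = 0; j < k; j++) ssc += Math.pow(colMean[j] - grand, 2);
        ssc *= n;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < k; j++)
                sse += Math.pow(ratings[i][j] - rowMean[i] - colMean[j] + grand, 2);

        double msr = ssr / (n - 1);             // between-subjects mean square
        double msc = ssc / (k - 1);             // between-raters mean square
        double mse = sse / ((n - 1) * (k - 1)); // residual mean square

        // ICC(2,k): reliability of the mean of the k graders' ratings.
        return (msr - mse) / (msr + (msc - mse) / n);
    }

    public static void main(String[] args) {
        // Toy data: 5 simulated post-editors rated by 4 graders (values illustrative only).
        double[][] ratings = {
            {4, 4, 5, 4},
            {2, 3, 2, 2},
            {5, 5, 4, 5},
            {3, 3, 3, 4},
            {1, 2, 1, 1}
        };
        System.out.printf("ICC(2,k) = %.3f%n", iccTwoWayAverage(ratings));
    }
}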

Degree

MA

College and Department

Humanities; Linguistics and English Language

Rights

http://lib.byu.edu/about/copyright/

Date Submitted

2012-05-25

Document Type

Selected Project

Handle

http://hdl.lib.byu.edu/1877/etd5256

Keywords

post-editing, quality, translation, specifications, Java, rubric, assessment

Language

English

Included in

Linguistics Commons
