Abstract

This thesis investigates whether domain-specific fine-tuning of a large multilingual neural machine translation model improves translation quality for English-to-Georgian religious text, a low-resource pair underserved by existing systems. Using the 1.3-billion-parameter NLLB-200 model as a baseline, two LoRA-adapted variants were trained: General-FT, on 91,143 parallel segments of contemporary religious discourse drawn from General Conference talks and other publications of The Church of Jesus Christ of Latter-day Saints, and Scriptural-FT, on 33,229 segments from the Church's scriptural canon. The three models were evaluated with automated metrics (BLEU, chrF, COMET) under both domain-matched and cross-domain conditions, and with a human-evaluation protocol structured by the Multidimensional Quality Metrics (MQM) framework and carried out by a panel of three professional translators from The Church of Jesus Christ of Latter-day Saints. The results strongly support the hypothesis that domain-specific fine-tuning improves quality. General-FT achieved a 29.73-point BLEU gain on general religious text and a 39.96% reduction in human-annotated errors on scriptural text. Scriptural-FT reduced register errors from 125 to 1, a 99.2% reduction, and produced the lowest major-error rate on scriptural text, while out-of-domain evaluation on FLORES-200 showed that neither fine-tuned model suffered catastrophic forgetting. At the same time, mistranslation (44%-61% of remaining errors) and grammar errors (31%-40%) persisted, indicating that domain adaptation produces domain-appropriate form without guaranteeing domain-appropriate content. The study closes by situating these empirical results within the hermeneutic demands of sacred-text translation.

Degree

MA

College and Department

Humanities; Linguistics

Rights

https://lib.byu.edu/about/copyright/

Date Submitted

2026-04-22

Document Type

Thesis

Keywords

neural machine translation, domain adaptation, low-resource languages, Georgian, religious text translation, NLLB-200, LoRA fine-tuning, MQM human evaluation, translation register, BLEU, chrF, COMET, sacred-text translation

Language

English
