The Fine-Tuning Paradox: Boosting Translation Quality Without Sacrificing LLM Abilities
| Authors | David Stap, Eva Hasler, Bill Byrne, Christof Monz, Ke Tran |
|---|---|
| Publication date | 2024 |
| Host editors | Lun-Wei Ku, Andre Martins, Vivek Srikumar |
| Book title | The 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024): proceedings of the conference |
| Book subtitle | ACL 2024: August 11-16, 2024 |
| ISBN (electronic) | |
| Event | 62nd Annual Meeting of the Association for Computational Linguistics, ACL 2024 |
| Volume | 1 |
| Pages (from-to) | 6189-6206 |
| Publisher | Kerrville, TX: Association for Computational Linguistics |
| Organisations | |
| Abstract | Fine-tuning large language models (LLMs) for machine translation has shown improvements in overall translation quality. However, it is unclear what impact fine-tuning has on desirable LLM behaviors that are not present in neural machine translation models, such as steerability, inherent document-level translation abilities, and the ability to produce less literal translations. We perform an extensive translation evaluation on the LLaMA and Falcon families of models, with sizes ranging from 7 billion up to 65 billion parameters. Our results show that while fine-tuning improves the general translation quality of LLMs, several abilities degrade. In particular, we observe a decline in the ability to perform formality steering, to produce technical translations through few-shot examples, and to perform document-level translation. On the other hand, we observe that the model produces less literal translations after fine-tuning on parallel data. We show that by including monolingual data as part of the fine-tuning data we can maintain these abilities while simultaneously enhancing overall translation quality. Our findings emphasize the need for fine-tuning strategies that preserve the benefits of LLMs for machine translation. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.18653/v1/2024.acl-long.336 |
| Downloads | 2024.acl-long.336 (Final published version) |
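The abstract's proposed mitigation, mixing monolingual data into the fine-tuning set alongside parallel translation pairs, can be illustrated with a short sketch. The snippet below is not the authors' code: the prompt template, the English-to-German direction, the 50/50 mixing ratio, and all function names are illustrative assumptions.

```python
# Illustrative sketch only: builds a fine-tuning mixture of parallel
# translation pairs and monolingual examples, following the idea in the
# abstract. The prompt format and mixing ratio are assumptions, not the
# paper's actual configuration.
import random


def format_translation(src: str, tgt: str) -> dict:
    """Turn a parallel sentence pair into a prompt/completion example."""
    return {
        "prompt": f"Translate from English to German:\n{src}\n",
        "completion": tgt,
    }


def format_monolingual(text: str) -> dict:
    """Use raw monolingual text as a plain language-modeling example."""
    return {"prompt": "", "completion": text}


def build_mixture(parallel, monolingual, mono_ratio=0.5, seed=0):
    """Interleave parallel and monolingual examples at a fixed ratio."""
    n_mono = int(len(parallel) * mono_ratio / (1 - mono_ratio))
    examples = [format_translation(s, t) for s, t in parallel]
    examples += [format_monolingual(t) for t in monolingual[:n_mono]]
    random.Random(seed).shuffle(examples)
    return examples


# Toy usage: two parallel pairs plus monolingual German sentences.
pairs = [("The cat sleeps.", "Die Katze schläft."),
         ("It is raining.", "Es regnet.")]
mono = ["Der Zug fährt um acht Uhr ab.", "Das Wetter ist heute schön."]
for example in build_mixture(pairs, mono):
    print(example)
```

Here the monolingual examples serve as plain language-modeling targets; the intuition from the abstract is that they help preserve general LLM abilities while the parallel pairs improve translation quality.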