IKUN for WMT24 General MT Task: LLMs Are Here for Multilingual Machine Translation

Open Access
Authors
  • B. Liao
  • C. Herold
  • S. Khadivi
  • C. Monz
Publication date 2024
Host editors
  • B. Haddow
  • T. Kocmi
  • P. Koehn
  • C. Monz
Book title Ninth Conference on Machine Translation : Proceedings of the Conference
Book subtitle WMT 2024 : November 15-16, 2024
ISBN (electronic)
  • 9798891761797
Event 9th Conference on Machine Translation
Pages (from-to) 263-269
Number of pages 7
Publisher Kerrville, TX: Association for Computational Linguistics
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
This paper introduces two multilingual systems, IKUN and IKUN-C, developed for the general machine translation task in WMT24. IKUN and IKUN-C represent an open system and a constrained system, respectively, built on Llama-3-8B and Mistral-7B-v0.3. Both systems are designed to handle all 11 language directions using a single model. According to automatic evaluation metrics, IKUN-C achieved 6 first-place and 3 second-place finishes among all constrained systems, while IKUN secured 1 first-place and 2 second-place finishes across both open and constrained systems. These encouraging results suggest that large language models (LLMs) are nearing the level of proficiency required for effective multilingual machine translation. The systems are based on a two-stage approach: first, continuous pre-training on monolingual data in 10 languages, followed by fine-tuning on high-quality parallel data for 11 language directions. The primary difference between IKUN and IKUN-C lies in their monolingual pre-training strategy: IKUN-C is pre-trained on constrained monolingual data, whereas IKUN leverages monolingual data from the OSCAR dataset. In the second phase, both systems are fine-tuned on parallel data sourced from NTREX, Flores, and WMT16-23 for all 11 language pairs.
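The two-stage recipe described in the abstract (continued pre-training on monolingual text, then supervised fine-tuning on parallel data rendered as translation prompts) maps onto a standard causal-LM training loop. Below is a minimal sketch using Hugging Face transformers and datasets, not the authors' released code: the file names, JSONL schema, prompt template, and hyperparameters are illustrative assumptions; only the Mistral-7B-v0.3 backbone and the data sources come from the abstract.

```python
# Minimal sketch of the two-stage recipe from the abstract.
# Not the authors' code: file names, the prompt template, and
# hyperparameters below are assumptions for illustration only.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE = "mistralai/Mistral-7B-v0.3"  # IKUN-C backbone per the abstract

tokenizer = AutoTokenizer.from_pretrained(BASE)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

# Plain next-token prediction in both stages (mlm=False).
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

def train(dataset, out_dir):
    Trainer(
        model=model,
        args=TrainingArguments(output_dir=out_dir, num_train_epochs=1,
                               per_device_train_batch_size=1),
        train_dataset=dataset,
        data_collator=collator,
    ).train()

# Stage 1: continued pre-training on monolingual text in 10 languages.
mono = load_dataset("text", data_files="monolingual_10langs.txt")["train"]  # hypothetical file
mono = mono.map(tokenize, batched=True, remove_columns=["text"])
train(mono, "stage1_cpt")

# Stage 2: supervised fine-tuning on parallel data (NTREX, Flores,
# WMT16-23), one model covering all 11 language directions.
def to_prompt(ex):
    # Hypothetical prompt template; the paper's exact format may differ.
    return {"text": f"Translate {ex['src_lang']} to {ex['tgt_lang']}:\n"
                    f"{ex['src']}\n{ex['tgt']}{tokenizer.eos_token}"}

para = load_dataset("json", data_files="parallel_11dirs.jsonl")["train"]  # hypothetical file
para = para.map(to_prompt)
para = para.map(tokenize, batched=True, remove_columns=para.column_names)
train(para, "stage2_sft")
```

Note that this sketch computes the loss over the full prompt-plus-target sequence; SFT setups often mask the source side so only target tokens contribute to the loss, and the paper may differ on this point.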
Document type Conference contribution
Language English
Published at
  • https://doi.org/10.48550/arXiv.2408.11512
  • https://doi.org/10.18653/v1/2024.wmt-1.19
Downloads
2408.11512v2 (Accepted author manuscript)
2024.wmt-1.19 (Final published version)