HyperAI


Pre-training Multilingual Neural Machine Translation by Leveraging Alignment Information

Zehui Lin Xiao Pan Mingxuan Wang Xipeng Qiu Jiangtao Feng Hao Zhou Lei Li

Abstract

We investigate the following question for machine translation (MT): can we develop a single universal MT model to serve as a common seed from which improved derivative models can be obtained for arbitrary language pairs? We propose mRASP, an approach to pre-training a universal multilingual neural machine translation model. The key idea in mRASP is its novel technique of random aligned substitution, which brings words and phrases with similar meanings across multiple languages closer together in the representation space. We pre-train an mRASP model jointly on 32 language pairs using only public datasets. The model is then fine-tuned on downstream language pairs to obtain specialized MT models. We carry out extensive experiments on 42 translation directions across diverse settings, including low-, medium-, and rich-resource pairs, as well as transfer to exotic language pairs. Experimental results demonstrate that mRASP achieves significant performance improvements compared to directly training on the target pairs. This is the first work to verify that multiple low-resource language pairs can be utilized to improve rich-resource MT. Surprisingly, mRASP is even able to improve translation quality on exotic languages that never occur in the pre-training corpus. Code, data, and pre-trained models are available at https://github.com/linzehui/mRASP.
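The core technique named in the abstract, random aligned substitution, can be illustrated with a short sketch: source tokens that appear in a bilingual dictionary are randomly replaced by an aligned translation, so that cross-lingual synonyms end up in shared contexts during pre-training. The function name, the dictionary format, and the substitution probability below are illustrative assumptions, not the paper's exact implementation.

```python
import random

def random_aligned_substitution(tokens, bilingual_dict, sub_prob=0.3, rng=None):
    """Sketch of random aligned substitution (RAS).

    tokens         : list of source-language tokens
    bilingual_dict : maps a source token to a list of aligned translations
    sub_prob       : probability of substituting a token that has an entry
    """
    rng = rng or random.Random(0)
    out = []
    for tok in tokens:
        if tok in bilingual_dict and rng.random() < sub_prob:
            # swap the token for one of its dictionary-aligned translations
            out.append(rng.choice(bilingual_dict[tok]))
        else:
            out.append(tok)
    return out

# toy English-to-French dictionary, for illustration only
dico = {"hello": ["bonjour"], "world": ["monde"]}
print(random_aligned_substitution(["hello", "world", "!"], dico, sub_prob=1.0))
```

In the full method, the code-switched source sentence is paired with the original target sentence, so the model learns that "hello" and "bonjour" should map to similar representations.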

Code Repositories

linzehui/mRASP (official)

Benchmarks

Benchmark: machine-translation-on-wmt2014-english-french
Methodology: mRASP + Fine-Tune
Metrics: BLEU score: 44.3; SacreBLEU: 41.7
