InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training

Zewen Chi Li Dong Furu Wei Nan Yang Saksham Singhal Wenhui Wang Xia Song Xian-Ling Mao Heyan Huang Ming Zhou


Abstract

In this work, we present an information-theoretic framework that formulates cross-lingual language model pre-training as maximizing mutual information between multilingual, multi-granularity texts. The unified view helps us better understand the existing methods for learning cross-lingual representations. More importantly, inspired by the framework, we propose a new pre-training task based on contrastive learning. Specifically, we regard a bilingual sentence pair as two views of the same meaning and encourage their encoded representations to be more similar to each other than to negative examples. By leveraging both monolingual and parallel corpora, we jointly train the pretext tasks to improve the cross-lingual transferability of pre-trained models. Experimental results on several benchmarks show that our approach achieves considerably better performance. The code and pre-trained models are available at https://aka.ms/infoxlm.
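The contrastive task described above can be illustrated with an InfoNCE-style objective: treat the translation of a sentence as the positive example and other sentences as negatives, then score the anchor against all candidates with a softmax over similarities. The sketch below is a minimal stdlib-only illustration of that loss shape; the function names, cosine similarity choice, and temperature value are illustrative assumptions, not taken from the InfoXLM codebase.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense sentence embeddings.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def infonce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss for one anchor sentence.

    anchor:    embedding of a sentence in language A
    positive:  embedding of its translation (the "second view")
    negatives: embeddings of unrelated sentences
    """
    # Similarity of the anchor to the positive (index 0) and to each negative.
    scores = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in scores]
    # Numerically stable log-sum-exp for the softmax normalizer.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    # Negative log-probability that the positive is picked out of all candidates.
    return -(logits[0] - log_z)
```

Minimizing this loss pushes the two views of the same meaning together while pushing unrelated sentences apart, which is one standard way to lower-bound the mutual information between the paired representations.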

Code Repositories

- facebookresearch/data2vec_vision (PyTorch, mentioned in GitHub)
- jiamingkong/infoxlm_paddle (Paddle, mentioned in GitHub)
- CZWin32768/xnlg (PyTorch, mentioned in GitHub)

Benchmarks

Benchmark: zero-shot-cross-lingual-transfer-on-xtreme
Methodology: T-ULRv2 + StableTune
Metrics:
  Avg: 80.7
  Question Answering: 72.9
  Sentence Retrieval: 89.3
  Sentence-pair Classification: 88.8
  Structured Prediction: 75.4
