Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling

Shruti Bhosale, Kyra Yee, Sergey Edunov, Michael Auli

Abstract

Pre-training models on vast quantities of unlabeled data has emerged as an effective approach to improving accuracy on many NLP tasks. On the other hand, traditional machine translation has a long history of leveraging unlabeled data through noisy channel modeling. The same idea has recently been shown to achieve strong improvements for neural machine translation. Unfortunately, naïve noisy channel modeling with modern sequence to sequence models is up to an order of magnitude slower than alternatives. We address this issue by introducing efficient approximations to make inference with the noisy channel approach as fast as strong ensembles while increasing accuracy. We also show that the noisy channel approach can outperform strong pre-training results by achieving a new state of the art on WMT Romanian-English translation.
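The noisy channel approach described above decomposes the translation probability with Bayes' rule, p(y|x) ∝ p(x|y) p(y), so candidates from a direct model can be rescored with a channel model and a language model. The sketch below illustrates this log-linear rescoring; the weight names, the per-candidate score fields, and the simple length normalization are illustrative assumptions, not the paper's exact formulation or tuned values.

```python
# Hedged sketch of noisy channel reranking: combine a direct model
# score log p(y|x), a channel model score log p(x|y), and a language
# model score log p(y) for each candidate translation y of source x.
# Weights lam_ch, lam_lm and the length normalization are assumptions.

def rescore(candidates, lam_ch=1.0, lam_lm=1.0):
    """candidates: list of dicts with fields
    'log_direct'  = log p(y|x)   (direct model),
    'log_channel' = log p(x|y)   (channel model),
    'log_lm'      = log p(y)     (language model),
    'length'      = |y| in tokens.
    Returns the highest-scoring candidate."""
    def score(c):
        s = c["log_direct"] + lam_ch * c["log_channel"] + lam_lm * c["log_lm"]
        return s / c["length"]  # normalize by target length
    return max(candidates, key=score)

# Toy example: the direct model alone prefers "a", but the channel
# model's score reverses the ranking in favour of "b".
cands = [
    {"text": "a", "log_direct": -2.0, "log_channel": -9.0, "log_lm": -3.0, "length": 4},
    {"text": "b", "log_direct": -2.5, "log_channel": -4.0, "log_lm": -2.5, "length": 4},
]
best = rescore(cands)  # combined score: a = -3.5, b = -2.25, so "b" wins
```

Scoring p(x|y) with a full sequence-to-sequence channel model for every partial hypothesis is what makes naïve online decoding slow; the paper's contribution is a set of approximations that make this rescoring competitive in speed with strong ensembles.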

Benchmarks

Benchmark: Machine Translation on WMT2016 Romanian-English
Methodology: Fast Noisy Channel Modeling
Metric: BLEU score 40.3
