A Convolutional Encoder Model for Neural Machine Translation

Jonas Gehring; Michael Auli; David Grangier; Yann N. Dauphin


Abstract

The prevalent approach to neural machine translation relies on bi-directional LSTMs to encode the source sentence. In this paper we present a faster and simpler architecture based on a succession of convolutional layers. This allows the entire source sentence to be encoded simultaneously, in contrast to recurrent networks, whose computation is constrained by temporal dependencies. On WMT'16 English-Romanian translation we achieve accuracy competitive with the state of the art, and we outperform several recently published results on the WMT'15 English-German task. Our models obtain almost the same accuracy as a very deep LSTM setup on WMT'14 English-French translation. Our convolutional encoder speeds up CPU decoding by more than a factor of two at the same or higher accuracy than a strong bi-directional LSTM baseline.
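The abstract describes replacing the bi-directional LSTM encoder with a stack of convolutional layers so that every source position is encoded in parallel. The sketch below illustrates that idea in PyTorch; the layer sizes, kernel width, and the tanh/residual structure are illustrative assumptions, not the exact configuration from the paper or the fairseq implementation.

```python
# Minimal sketch of a convolutional sentence encoder, assuming illustrative
# hyperparameters (embed_dim, hidden_dim, kernel_width, num_layers).
import torch
import torch.nn as nn


class ConvEncoder(nn.Module):
    def __init__(self, vocab_size, max_positions=1024,
                 embed_dim=256, hidden_dim=256, kernel_width=3, num_layers=6):
        super().__init__()
        # Word and position embeddings are summed before the convolutions.
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.pos_embed = nn.Embedding(max_positions, embed_dim)
        # Stack of 1-D convolutions over the time dimension; padding keeps
        # the output length equal to the source length.
        self.convs = nn.ModuleList([
            nn.Conv1d(embed_dim if i == 0 else hidden_dim, hidden_dim,
                      kernel_width, padding=kernel_width // 2)
            for i in range(num_layers)
        ])

    def forward(self, src_tokens):
        # src_tokens: (batch, src_len) integer tensor
        positions = torch.arange(src_tokens.size(1), device=src_tokens.device)
        x = self.word_embed(src_tokens) + self.pos_embed(positions)
        x = x.transpose(1, 2)              # (batch, channels, src_len) for Conv1d
        for conv in self.convs:
            residual = x
            x = torch.tanh(conv(x))
            if residual.size(1) == x.size(1):
                x = x + residual           # residual connection between layers
        return x.transpose(1, 2)           # (batch, src_len, hidden_dim)


# Usage: all source positions are processed simultaneously, with no recurrence,
# which is what makes encoding faster than an LSTM on long sentences.
encoder = ConvEncoder(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 15))   # batch of 2 sentences, 15 tokens each
states = encoder(tokens)                     # shape: (2, 15, 256)
```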

Code Repositories

facebookresearch/fairseq (PyTorch) - Mentioned in GitHub
siyuofzhou/CNNSeqToSeq (PyTorch) - Mentioned in GitHub

Benchmarks

Benchmark | Methodology | Metrics
machine-translation-on-iwslt2015-german | Conv-LSTM (deep+pos) | BLEU score: 30.4
machine-translation-on-wmt2014-english-french | Deep Convolutional Encoder; single-layer decoder | BLEU score: 35.7
machine-translation-on-wmt2016-english-1 | Deep Convolutional Encoder; single-layer decoder | BLEU score: 27.8
machine-translation-on-wmt2016-english-1 | BiLSTM | BLEU score: 27.5
