Linguistic Input Features Improve Neural Machine Translation

Rico Sennrich; Barry Haddow

Abstract

Neural machine translation has recently achieved impressive results, while using little in the way of external linguistic information. In this paper we show that the strong learning capability of neural MT models does not make linguistic features redundant; they can be easily incorporated to provide further improvements in performance. We generalize the embedding layer of the encoder in the attentional encoder-decoder architecture to support the inclusion of arbitrary features, in addition to the baseline word feature. We add morphological features, part-of-speech tags, and syntactic dependency labels as input features to English<->German and English->Romanian neural machine translation systems. In experiments on WMT16 training and test sets, we find that linguistic input features improve model quality according to three metrics: perplexity, BLEU and CHRF3. An open-source implementation of our neural MT system is available, as are sample files and configurations.
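
The abstract describes generalizing the encoder's embedding layer so that each input position carries several parallel features (the word plus, for example, its lemma, part-of-speech tag, dependency label, and morphology), and the per-feature embeddings are concatenated into a single input vector for the encoder. The sketch below illustrates such a factored embedding layer in PyTorch; it is an illustrative assumption, not the authors' open-source system, and the vocabulary sizes and embedding dimensions in the usage example are made up.

import torch
import torch.nn as nn

class FactoredEmbedding(nn.Module):
    """Embed several parallel input features and concatenate them per token."""

    def __init__(self, vocab_sizes, embedding_dims):
        super().__init__()
        assert len(vocab_sizes) == len(embedding_dims)
        # One embedding table per linguistic feature (word, POS tag, dependency label, ...).
        self.tables = nn.ModuleList(
            nn.Embedding(v, d) for v, d in zip(vocab_sizes, embedding_dims)
        )
        # Total embedding size seen by the encoder.
        self.output_dim = sum(embedding_dims)

    def forward(self, feature_ids):
        # feature_ids: (batch, seq_len, num_features) integer tensor, one column per feature.
        embedded = [table(feature_ids[..., i]) for i, table in enumerate(self.tables)]
        return torch.cat(embedded, dim=-1)  # (batch, seq_len, output_dim)

# Hypothetical usage: word, POS-tag, and dependency-label features.
embed = FactoredEmbedding(vocab_sizes=[50000, 50, 40], embedding_dims=[480, 16, 16])
tokens = torch.zeros(2, 7, 3, dtype=torch.long)  # dummy batch: 2 sentences, 7 tokens, 3 features
encoder_input = embed(tokens)                    # shape (2, 7, 512), fed to the encoder RNN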

Benchmarks

Benchmark                                       Methodology                 Metrics
machine-translation-on-wmt2016-english-german   Linguistic Input Features   BLEU score: 28.4
machine-translation-on-wmt2016-german-english   Linguistic Input Features   BLEU score: 32.9
