Towards Neural Phrase-based Machine Translation

Po-Sen Huang, Chong Wang, Sitao Huang, Dengyong Zhou, Li Deng

Abstract

In this paper, we present Neural Phrase-based Machine Translation (NPMT). Our method explicitly models the phrase structures in output sequences using Sleep-WAke Networks (SWAN), a recently proposed segmentation-based sequence modeling method. To mitigate the monotonic alignment requirement of SWAN, we introduce a new layer that performs (soft) local reordering of input sequences. Unlike existing neural machine translation (NMT) approaches, NPMT does not use attention-based decoding mechanisms. Instead, it directly outputs phrases in sequential order and can decode in linear time. Our experiments show that NPMT achieves superior performance on the IWSLT 2014 German-English/English-German and IWSLT 2015 English-Vietnamese machine translation tasks compared with strong NMT baselines. We also observe that our method produces meaningful phrases in the output languages.
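The (soft) local reordering layer mentioned in the abstract can be sketched as follows: for each position, the layer looks at a small window of neighboring input embeddings, computes a sigmoid gate per window slot from the concatenated window, and emits the tanh of the gated sum. This is a minimal NumPy sketch under our own simplifying assumptions (the function name `soft_reorder`, the zero-padding at the boundaries, and the flat gate weight matrix `W` are illustrative choices, not the paper's exact parameterization).

```python
import numpy as np

def soft_reorder(E, W, tau):
    """Soft local reordering over windows of 2*tau + 1 embeddings.

    E: (T, d) input embeddings.
    W: (2*tau + 1, (2*tau + 1) * d) gate weights, one row per window slot.

    For each position t, slot i gets gate sigmoid(W[i] . concat(window_t)),
    and the output is tanh of the gated sum of the window's embeddings.
    """
    T, d = E.shape
    win = 2 * tau + 1
    # Zero-pad so every position has a full window (a boundary assumption).
    Ep = np.vstack([np.zeros((tau, d)), E, np.zeros((tau, d))])
    H = np.zeros_like(E)
    for t in range(T):
        window = Ep[t:t + win]                    # (win, d), centered at t
        ctx = window.reshape(-1)                  # concatenated window
        gates = 1.0 / (1.0 + np.exp(-(W @ ctx)))  # (win,) sigmoid gates
        H[t] = np.tanh(gates @ window)            # gated sum, squashed
    return H

# Example: 5 positions, 4-dim embeddings, window of 3 (tau = 1).
rng = np.random.RandomState(0)
E = rng.randn(5, 4)
W = rng.randn(3, 12)
H = soft_reorder(E, W, tau=1)
```

Because the gates are soft, the layer can mix a neighbor's embedding into position t — a differentiable stand-in for local swaps — which is what relaxes SWAN's monotonic alignment requirement.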

Code Repositories

posenhuang/NPMT (official; PyTorch)
Microsoft/NPMT (PyTorch)
ykrmm/ICLR_2020 (PyTorch)
ykrmm/TREMBA (PyTorch)

Benchmarks

Benchmark                                  Methodology                    Metric
machine-translation-on-iwslt2014-german    Neural PBMT + LM [Huang2018]   BLEU score: 30.08
machine-translation-on-iwslt2015-english   NPMT + language model          BLEU score: 25.36
machine-translation-on-iwslt2015-german    NPMT + language model          BLEU score: 30.08
