Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement
Jason Lee, Elman Mansimov, Kyunghyun Cho

Abstract
We propose a conditional non-autoregressive neural sequence model based on iterative refinement. The proposed model is designed based on the principles of latent variable models and denoising autoencoders, and is generally applicable to any sequence generation task. We extensively evaluate the proposed model on machine translation (En-De and En-Ro) and image caption generation, and observe that it significantly speeds up decoding while maintaining the generation quality comparable to the autoregressive counterpart.
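The abstract describes decoding that starts from a parallel, non-autoregressive draft of the target sequence and then repeatedly refines it, in the spirit of a denoising autoencoder, until the output stabilizes. The sketch below illustrates that loop only in broad strokes; `encoder`, `decoder`, `predict_length`, and the all-padding initialization are hypothetical placeholders, not the authors' implementation (see the official nyu-dl/dl4mt-nonauto repository for the real code).

```python
# Minimal sketch of iterative-refinement decoding, assuming hypothetical
# `encoder`, `decoder`, and `predict_length` components.
import torch

@torch.no_grad()
def refine_decode(encoder, decoder, predict_length, src_tokens,
                  max_iters=10, pad_id=0):
    """Produce an initial non-autoregressive draft, then refine it."""
    enc = encoder(src_tokens)            # encode the source once
    tgt_len = predict_length(enc)        # predicted target length (an int here)
    # Initial draft: a trivial all-padding target (a stand-in for the
    # model's actual initialization).
    tgt = torch.full((src_tokens.size(0), tgt_len), pad_id, dtype=torch.long)

    prev = None
    for _ in range(max_iters):
        logits = decoder(enc, tgt)       # predict all positions in parallel
        tgt = logits.argmax(dim=-1)      # deterministic refinement step
        if prev is not None and torch.equal(tgt, prev):
            break                        # stop once the output no longer changes
        prev = tgt.clone()
    return tgt
```

Because each refinement step updates every position in parallel, decoding cost grows with the (small, fixed) number of refinement iterations rather than with the target length, which is the source of the reported speed-up over autoregressive decoding.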
Code Repositories
- zhajiahe/Token_Drop (PyTorch)
- nyu-dl/dl4mt-nonauto (official, PyTorch)
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| machine-translation-on-iwslt2015-english | Denoising autoencoders (non-autoregressive) | BLEU score: 27.01 |
| machine-translation-on-iwslt2015-german | Denoising autoencoders (non-autoregressive) | BLEU score: 32.43 |
| machine-translation-on-wmt2014-english-german | Denoising autoencoders (non-autoregressive) | BLEU score: 21.54 |
| machine-translation-on-wmt2014-german-english | Denoising autoencoders (non-autoregressive) | BLEU score: 25.43 |
| machine-translation-on-wmt2016-english-1 | Denoising autoencoders (non-autoregressive) | BLEU score: 29.66 |
| machine-translation-on-wmt2016-romanian | Denoising autoencoders (non-autoregressive) | BLEU score: 30.30 |