Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction

Maha Elbayad; Laurent Besacier; Jakob Verbeek


Abstract

Current state-of-the-art machine translation systems are based on encoder-decoder architectures that first encode the input sequence and then generate an output sequence conditioned on that encoding. Both components are interfaced with an attention mechanism that recombines a fixed encoding of the source tokens based on the decoder state. We propose an alternative approach that instead relies on a single 2D convolutional neural network applied across both sequences. Each layer of our network re-codes source tokens on the basis of the output sequence produced so far, so attention-like properties are pervasive throughout the network. Our model yields excellent results, outperforming state-of-the-art encoder-decoder systems, while being conceptually simpler and having fewer parameters.
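The core idea above can be illustrated with a minimal PyTorch sketch. This is an assumption-laden toy, not the paper's actual DenseNet architecture: it builds a (target × source) grid by pairing every target embedding with every source embedding, applies one convolution that is masked (causal) along the target axis so position i only sees outputs produced so far, and max-pools over the source axis to get per-target-position features. The class and parameter names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PervasiveAttentionSketch(nn.Module):
    """Toy single-layer sketch of the 2D-conv seq2seq idea.

    NOT the paper's model (which stacks masked convolutions in a
    DenseNet); this only shows the 2D grid + causal masking + source
    pooling mechanics.
    """
    def __init__(self, src_vocab, tgt_vocab, d=64, k=3):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d)
        self.tgt_emb = nn.Embedding(tgt_vocab, d)
        self.k = k
        # input channels = 2d: each grid cell holds [target_emb ; source_emb]
        self.conv = nn.Conv2d(2 * d, d, kernel_size=k)
        self.out = nn.Linear(d, tgt_vocab)

    def forward(self, src, tgt):
        # src: (B, S) source token ids; tgt: (B, T) shifted target ids
        s = self.src_emb(src)                       # (B, S, d)
        t = self.tgt_emb(tgt)                       # (B, T, d)
        B, S, d = s.shape
        T = t.size(1)
        # 2D grid: cell (i, j) concatenates target emb i and source emb j
        grid = torch.cat([
            t.unsqueeze(2).expand(B, T, S, d),
            s.unsqueeze(1).expand(B, T, S, d),
        ], dim=-1).permute(0, 3, 1, 2)              # (B, 2d, T, S)
        # pad symmetrically along the source axis, but only with "past"
        # rows along the target axis -> causal in the target direction
        p = self.k - 1
        grid = F.pad(grid, (p // 2, p - p // 2,     # source axis
                            p, 0))                  # target axis: past only
        h = torch.relu(self.conv(grid))             # (B, d, T, S)
        # pool over the source axis: one vector per target position
        h = h.max(dim=3).values.permute(0, 2, 1)    # (B, T, d)
        return self.out(h)                          # (B, T, tgt_vocab)
```

Because every layer sees both sequences jointly, re-encoding the source conditioned on the target prefix falls out of the convolution itself; no separate attention module is needed. The causality can be checked directly: perturbing a later target token leaves logits at earlier positions unchanged.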

Code Repositories

elbayadm/attn2d (official, PyTorch) — mentioned on GitHub
tdiggelm/nn-experiments — mentioned on GitHub

Benchmarks

Benchmark | Methodology | Metrics
machine-translation-on-iwslt2015-english | Pervasive Attention | BLEU score: 27.99
machine-translation-on-iwslt2015-german | Pervasive Attention | BLEU score: 34.18
