Alleviating the Inequality of Attention Heads for Neural Machine Translation

Zewei Sun, Shujian Huang, Xin-Yu Dai, Jiajun Chen

Abstract

Recent studies show that the attention heads in Transformer are not equal. We relate this phenomenon to the imbalanced training of multi-head attention and the model's dependence on specific heads. To tackle this problem, we propose HeadMask, a simple masking method with two specific variants. Experiments show that translation improvements are achieved on multiple language pairs. Subsequent empirical analyses also support our assumption and confirm the effectiveness of the method.
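
The page gives no implementation details, but a minimal sketch of what the two masking variants could look like in PyTorch follows. All names (`make_head_mask`, `apply_head_mask`, `importance`, `n_masked`) are illustrative assumptions, not identifiers from the authors' code, and the importance scoring itself is left abstract here.

```python
# A minimal sketch of the head-masking idea, assuming a standard
# PyTorch multi-head attention layout. Names are illustrative
# assumptions, not taken from the paper's implementation.
from typing import Optional

import torch


def make_head_mask(n_layers: int, n_heads: int, n_masked: int,
                   importance: Optional[torch.Tensor] = None) -> torch.Tensor:
    """Build a (n_layers, n_heads) 0/1 mask with n_masked heads zeroed.

    importance is None     -> HeadMask (Random): mask heads uniformly
                              at random.
    importance is a tensor -> HeadMask (Impt): mask the heads with the
                              highest importance scores, forcing the
                              remaining heads to carry the load.
    """
    total = n_layers * n_heads
    mask = torch.ones(total)
    if importance is None:
        masked_idx = torch.randperm(total)[:n_masked]
    else:
        masked_idx = importance.flatten().topk(n_masked).indices
    mask[masked_idx] = 0.0
    return mask.view(n_layers, n_heads)


def apply_head_mask(attn_heads: torch.Tensor,
                    layer_mask: torch.Tensor) -> torch.Tensor:
    """Zero out masked heads' outputs before the output projection.

    attn_heads: (batch, n_heads, seq_len, head_dim) per-head outputs
    layer_mask: (n_heads,) row of the mask for the current layer
    """
    return attn_heads * layer_mask.view(1, -1, 1, 1)
```

In such a setup the mask would be applied during training only and disabled at inference; the Random and Impt labels in the benchmark table below correspond to these two variants.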

Benchmarks

| Benchmark | Methodology | BLEU |
| --- | --- | --- |
| machine-translation-on-iwslt2015-vietnamese | HeadMask (Random-18) | 26.85 |
| machine-translation-on-iwslt2015-vietnamese | HeadMask (Impt-18) | 26.36 |
| machine-translation-on-wmt2016-romanian | HeadMask (Random-18) | 32.85 |
| machine-translation-on-wmt2016-romanian | HeadMask (Impt-18) | 32.95 |
| machine-translation-on-wmt2017-turkish | HeadMask (Impt-18) | 17.48 |
| machine-translation-on-wmt2017-turkish | HeadMask (Random-18) | 17.56 |
