Hierarchical Learning for Generation with Long Source Sequences

Tobias Rohde, Xiaoxia Wu, Yinhan Liu

Abstract

One of the challenges for current sequence-to-sequence (seq2seq) models is processing long sequences, such as those in summarization and document-level machine translation tasks. These tasks require the model to reason at the token level as well as at the sentence and paragraph level. We design and study a new Hierarchical Attention Transformer-based architecture (HAT) that outperforms standard Transformers on several sequence-to-sequence tasks. Furthermore, our model achieves state-of-the-art ROUGE scores on several summarization tasks, including PubMed, arXiv, CNN/DM, SAMSum, and AMI. Our model also outperforms a document-level machine translation baseline on the WMT20 English-to-German translation task. We investigate what the hierarchical layers learn by visualizing the hierarchical encoder-decoder attention. Finally, we study hierarchical learning on encoder-only pre-training and analyze its performance on classification tasks.
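The two-level idea described in the abstract (attention over tokens, then attention over sentence representations) can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the attention here is single-head, unparameterized, and sentences are pooled by a simple mean, all of which are simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Scaled dot-product self-attention; projection weights omitted for brevity.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores) @ x

def hierarchical_encode(token_embs, sent_boundaries):
    """Token-level attention within each sentence, then sentence-level
    attention over pooled sentence vectors (the two-level HAT idea)."""
    sent_vecs = []
    for start, end in sent_boundaries:
        toks = self_attention(token_embs[start:end])  # token-level layer
        sent_vecs.append(toks.mean(axis=0))           # pool to one sentence vector
    sent_mat = np.stack(sent_vecs)
    return self_attention(sent_mat)                   # sentence-level layer

# Toy document: 10 tokens split into 3 sentences, embedding dim 8.
rng = np.random.default_rng(0)
embs = rng.normal(size=(10, 8))
out = hierarchical_encode(embs, [(0, 4), (4, 7), (7, 10)])
print(out.shape)  # (3, 8): one contextualized vector per sentence
```

The sentence-level output gives the decoder a coarse, paragraph-scale view of the source in addition to the token-level one, which is what lets the model reason above the token level on long inputs.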

Benchmarks

Benchmark                                  Methodology     Metrics
document-summarization-on-cnn-daily-mail   HAT-BART        ROUGE-1: 44.48 | ROUGE-2: 21.31 | ROUGE-L: 41.52
reading-comprehension-on-race              HAT (Encoder)   Accuracy: 67.3
text-summarization-on-ami                  HAT-CNNDM       ROUGE-1: 52.27 | ROUGE-2: 20.15 | ROUGE-L: 50.57
text-summarization-on-arxiv                HAT-BART        ROUGE-1: 46.74 | ROUGE-2: 19.19 | ROUGE-L: 42.2
text-summarization-on-pubmed-1             HAT-BART        ROUGE-1: 48.25 | ROUGE-2: 21.35 | ROUGE-L: 36.69
text-summarization-on-samsum-corpus        HAT-CNNDM       ROUGE-1: 53.01 | ROUGE-2: 28.27
text-summarization-on-samsum-corpus        HAT-CNNDM RL    ROUGE-L: 48.84
text-summarization-on-x-sum                HAT-BART        ROUGE-1: 45.92 | ROUGE-2: 22.79
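All summarization rows above report ROUGE-N, an n-gram overlap metric. The recall-oriented core of ROUGE-N can be sketched as follows; note this is a simplified illustration (published scores are typically the F-measure from the official ROUGE scorer, with stemming and other preprocessing).

```python
from collections import Counter

def rouge_n_recall(candidate, reference, n=1):
    """Recall-oriented ROUGE-N sketch: fraction of the reference's
    n-grams that also appear in the candidate (clipped counts)."""
    def ngrams(text, n):
        toks = text.split()
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))
    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    overlap = sum((cand & ref).values())  # & takes the element-wise minimum
    return overlap / max(sum(ref.values()), 1)

score = rouge_n_recall("the cat sat on the mat", "the cat lay on the mat", n=1)
print(round(score, 2))  # 0.83 -> 5 of the reference's 6 unigrams are matched
```

ROUGE-2 uses bigrams (n=2), and ROUGE-L instead scores the longest common subsequence between candidate and reference.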
