Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond

Ramesh Nallapati; Bowen Zhou; Cicero Nogueira dos Santos; Caglar Gulcehre; Bing Xiang

Abstract

In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora. We propose several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling key-words, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time. Our work shows that many of our proposed models contribute to further improvement in performance. We also propose a new dataset consisting of multi-sentence summaries, and establish performance benchmarks for further research.
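The attentional encoder-decoder the abstract refers to scores each encoder hidden state against the current decoder state and summarizes the source as a weighted context vector. A minimal sketch of one such attention step (dot-product scoring is a simplification here; the paper builds on additive, Bahdanau-style attention, and the function names are illustrative):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_step(enc_states, dec_state):
    """One decoder step of (simplified) dot-product attention.

    enc_states: (T, d) encoder RNN hidden states, one per source position
    dec_state:  (d,)   current decoder hidden state
    Returns the attention distribution over source positions and the
    context vector (weighted sum of encoder states).
    """
    scores = enc_states @ dec_state   # (T,) alignment scores
    weights = softmax(scores)         # attention distribution over the source
    context = weights @ enc_states    # (d,) context vector fed to the decoder
    return weights, context

# Toy example: 4 source positions, hidden size 3.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
w, c = attention_step(enc, dec)
```

At generation time the decoder conditions on this context vector at every step, which is what lets the model attend to different source words while emitting each summary word.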

Benchmarks

Benchmark                                      Methodology            ROUGE-1   ROUGE-2   ROUGE-L
abstractive-text-summarization-on-cnn-daily    LEAD-3                 40.42     17.62     36.67
text-summarization-on-cnn-daily-mail-2         words-lvt2k-temp-att   35.46     13.30     32.65
text-summarization-on-duc-2004-task-1          words-lvt5k-1sent      28.61     9.42      25.24
text-summarization-on-gigaword                 words-lvt5k-1sent      36.4      17.7      33.71
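The ROUGE scores above measure n-gram overlap between a system summary and a reference summary. A hedged sketch of ROUGE-N recall with clipped n-gram matching (the official ROUGE toolkit adds stemming, stopword handling, and precision/F-measure variants, none of which are shown here):

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """Recall-oriented ROUGE-N: fraction of reference n-grams that
    also appear in the candidate, with counts clipped per n-gram."""
    def ngrams(tokens, n):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    cand = ngrams(candidate.split(), n)
    ref = ngrams(reference.split(), n)
    if not ref:
        return 0.0
    # Clipped overlap: each reference n-gram can be matched at most
    # as many times as it occurs in the reference.
    overlap = sum(min(count, ref[gram]) for gram, count in cand.items() if gram in ref)
    return overlap / sum(ref.values())

# 5 of the 6 reference unigrams are covered by the candidate.
score = rouge_n("the cat sat on the mat", "the cat is on the mat", n=1)  # 5/6
```

ROUGE-L, also reported above, instead scores the longest common subsequence between candidate and reference rather than fixed-length n-grams.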
