HyperAI
Selective Encoding for Abstractive Sentence Summarization

Qingyu Zhou; Nan Yang; Furu Wei; Ming Zhou

Abstract

We propose a selective encoding model that extends the sequence-to-sequence framework for abstractive sentence summarization. It consists of a sentence encoder, a selective gate network, and an attention-equipped decoder. The sentence encoder and decoder are built with recurrent neural networks. The selective gate network constructs a second-level sentence representation by controlling the information flow from encoder to decoder. This second-level representation is tailored to the sentence summarization task, which leads to better performance. We evaluate our model on the English Gigaword, DUC 2004, and MSR abstractive sentence summarization datasets. The experimental results show that the proposed selective encoding model outperforms state-of-the-art baseline models.
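The selective gate described in the abstract can be sketched as follows. This is a minimal numpy illustration, not the authors' code: it assumes the common formulation in which each first-level encoder hidden state is element-wise scaled by a sigmoid gate computed from that hidden state and a whole-sentence representation. All names and dimensions here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def selective_gate(H, s, W_h, W_s, b):
    """Build the second-level sentence representation.

    H:   (T, d) first-level encoder hidden states, one per source word
    s:   (d,)   sentence representation (e.g. concatenated final encoder states)
    Returns (T, d): gated hidden states H' = gate * H, where each gate
    value lies in (0, 1) and controls information flow to the decoder.
    """
    gate = sigmoid(H @ W_h.T + s @ W_s.T + b)  # (T, d)
    return gate * H

# Toy dimensions for illustration only.
T, d = 5, 8
rng = np.random.default_rng(0)
H = rng.normal(size=(T, d))
s = rng.normal(size=(d,))
W_h = rng.normal(size=(d, d))
W_s = rng.normal(size=(d, d))
b = np.zeros(d)

H2 = selective_gate(H, s, W_h, W_s, b)
```

Because the gate is a sigmoid, each component of the second-level representation has the same sign as, and smaller magnitude than, the corresponding first-level component; the decoder then attends over these filtered states instead of the raw ones.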

Benchmarks

Benchmark | Methodology | ROUGE-1 | ROUGE-2 | ROUGE-L
text-summarization-on-duc-2004-task-1 | SEASS | 29.21 | 9.56 | 25.51
text-summarization-on-gigaword | SEASS | 36.15 | 17.54 | 33.63
