Multi-Document Summarization with Determinantal Point Process Attention

Mirella Lapata, Laura Perez-Beltrachini

Abstract

The ability to convey relevant and diverse information is critical in multi-document summarization and yet remains elusive for neural seq-to-seq models, whose outputs are often redundant and fail to correctly cover important details. In this work, we propose an attention mechanism which encourages greater focus on relevance and diversity. Attention weights are computed based on (proportional) probabilities given by Determinantal Point Processes (DPPs) defined on the set of content units to be summarized. DPPs have been successfully used in extractive summarization; here we use them to select relevant and diverse content for neural abstractive summarization. We integrate DPP-based attention with various seq-to-seq architectures, ranging from CNNs to LSTMs and Transformers. Experimental evaluation shows that our attention mechanism consistently improves summarization and delivers performance comparable with the state-of-the-art on the Multi-News dataset.
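To make the attention mechanism concrete, the sketch below shows one way attention weights over content units could be derived from DPP marginal probabilities. It is illustrative only and assumes a standard L-ensemble construction (quality scores times a similarity kernel) with the marginal kernel K = L(L + I)^{-1}; the function and variable names are hypothetical and do not reflect the paper's actual implementation.

```python
# Minimal sketch: DPP-proportional attention weights over content units.
# Assumptions: an L-ensemble built from per-unit quality scores and unit-norm
# feature vectors; weights taken from the diagonal of the marginal kernel.
import torch


def dpp_attention_weights(quality, features, eps=1e-8):
    """Return attention weights over N content units, proportional to the
    DPP marginal inclusion probabilities.

    quality:  (N,) positive relevance scores q_i for each content unit.
    features: (N, d) feature vectors phi_i encoding similarity between units.
    """
    # L-ensemble kernel: L_ij = q_i * <phi_i, phi_j> * q_j
    phi = torch.nn.functional.normalize(features, dim=-1)
    similarity = phi @ phi.T
    L = quality.unsqueeze(1) * similarity * quality.unsqueeze(0)

    # Marginal kernel K = L (L + I)^{-1}; K_ii = P(unit i is in the sampled set)
    n = L.size(0)
    K = L @ torch.linalg.inv(L + torch.eye(n))
    marginals = torch.clamp(torch.diagonal(K), min=eps)

    # Normalise the marginals so they sum to one, as attention weights do
    return marginals / marginals.sum()


# Example: 4 content units where unit 1 is nearly a duplicate of unit 0,
# so the DPP down-weights it relative to a purely relevance-based softmax.
quality = torch.tensor([1.0, 0.9, 0.8, 0.5])
features = torch.randn(4, 16)
features[1] = features[0] + 0.01 * torch.randn(16)
print(dpp_attention_weights(quality, features))
```

The key property exploited here is that DPP marginals trade off relevance (the quality scores) against redundancy (feature similarity), so near-duplicate units share probability mass instead of each receiving high attention independently.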

Benchmarks

Benchmark: multi-document-summarization-on-multi-news
Methodology: CTF+DPP
Metrics: ROUGE-1: 45.84, ROUGE-2: 15.94, ROUGE-SU4: 19.19
