
Hie-BART: Document Summarization with Hierarchical BART

Takashi Ninomiya, Akihiro Tamura, Kazuki Akiyama

Abstract

This paper proposes a new abstractive document summarization model, hierarchical BART (Hie-BART), which captures the hierarchical structure of a document (i.e., its sentence-word structure) in the BART model. Although the existing BART model has achieved state-of-the-art performance on document summarization tasks, it does not model interactions between sentence-level and word-level information. In machine translation, the performance of neural machine translation models has been improved by incorporating multi-granularity self-attention (MG-SA), which captures the relationships between words and phrases. Inspired by that work, the proposed Hie-BART model incorporates MG-SA into the encoder of the BART model to capture sentence-word structure. Evaluation on the CNN/Daily Mail dataset shows that the proposed Hie-BART model outperforms several strong baselines and improves on a non-hierarchical BART model (+0.23 ROUGE-L).
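The paper itself does not include code, so the PyTorch sketch below only illustrates the general MG-SA idea described in the abstract: a subset of attention heads attends over sentence-level representations (here, mean-pooled token states per sentence) while the remaining heads perform ordinary word-level self-attention. All names, the pooling choice, and the head split are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiGranularitySelfAttention(nn.Module):
    """Illustrative sketch of multi-granularity self-attention.

    A few heads attend over sentence-level keys/values (mean-pooled
    token states per sentence); the rest attend over word-level
    keys/values as usual. Hyperparameters are assumptions.
    """

    def __init__(self, d_model=768, n_heads=12, n_sent_heads=4):
        super().__init__()
        assert d_model % n_heads == 0
        self.d_head = d_model // n_heads
        self.n_heads = n_heads
        self.n_sent_heads = n_sent_heads  # heads using sentence granularity
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x, sent_ids):
        # x: (batch, seq_len, d_model)
        # sent_ids: (batch, seq_len) int64 sentence index per token
        B, T, D = x.shape

        # Sentence representations: mean-pool the tokens of each sentence.
        n_sents = int(sent_ids.max().item()) + 1
        one_hot = F.one_hot(sent_ids, n_sents).float()          # (B, T, S)
        counts = one_hot.sum(dim=1).clamp(min=1).unsqueeze(-1)  # (B, S, 1)
        sent_repr = one_hot.transpose(1, 2) @ x / counts        # (B, S, D)

        q = self.q_proj(x).view(B, T, self.n_heads, self.d_head).transpose(1, 2)

        def kv(src):
            # Project a source sequence into per-head keys and values.
            L = src.size(1)
            k = self.k_proj(src).view(B, L, self.n_heads, self.d_head).transpose(1, 2)
            v = self.v_proj(src).view(B, L, self.n_heads, self.d_head).transpose(1, 2)
            return k, v

        k_w, v_w = kv(x)          # word-level keys/values, length T
        k_s, v_s = kv(sent_repr)  # sentence-level keys/values, length S

        def attend(q, k, v):
            scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
            return F.softmax(scores, dim=-1) @ v

        ns = self.n_sent_heads
        out_sent = attend(q[:, :ns], k_s[:, :ns], v_s[:, :ns])  # sentence granularity
        out_word = attend(q[:, ns:], k_w[:, ns:], v_w[:, ns:])  # word granularity
        out = torch.cat([out_sent, out_word], dim=1)            # (B, H, T, d_head)
        out = out.transpose(1, 2).reshape(B, T, D)
        return self.out_proj(out)

In a full model, a layer like this would replace or sit alongside the standard self-attention in some BART encoder layers, with sent_ids derived from sentence-boundary tokens in the input.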

Benchmarks

Benchmark                                   Methodology   ROUGE-1   ROUGE-2   ROUGE-L
document-summarization-on-cnn-daily-mail   Hie-BART      44.35     21.37     41.05
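For reference, ROUGE F1 scores like those above are commonly computed with Google's rouge-score package; the snippet below is a minimal, hypothetical usage example with toy strings (the paper does not state its exact evaluation toolkit or settings).

# pip install rouge-score
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)

reference = "the cat sat on the mat"   # gold summary (toy example)
prediction = "the cat lay on the mat"  # system summary (toy example)

scores = scorer.score(reference, prediction)
for name, s in scores.items():
    print(f"{name}: F1 = {s.fmeasure:.4f}")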
