Mind the Gap! Injecting Commonsense Knowledge for Abstractive Dialogue Summarization
Seungone Kim, Se June Joo, Hyungjoo Chae, Chaehyeong Kim, Seung-won Hwang, Jinyoung Yeo

Abstract
In this paper, we propose to leverage a unique characteristic of dialogues, that participants share commonsense knowledge, to resolve the difficulties in summarizing them. We present SICK, a framework that uses commonsense inferences as additional context. Compared to previous work that relies solely on the input dialogue, SICK uses an external knowledge model to generate a rich set of commonsense inferences and selects the most probable one with a similarity-based selection method. Building on SICK, SICK++ utilizes commonsense as supervision, adding the task of generating commonsense inferences to dialogue summarization in a multi-task learning setting. Experimental results show that with injected commonsense knowledge, our framework generates more informative and consistent summaries than existing methods.
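The abstract mentions a similarity-based selection step over commonsense inferences produced by an external knowledge model. Below is a minimal sketch of that idea, not the paper's released implementation: the candidate inferences are assumed to come from a COMET-style generator, and similarity is approximated with cosine similarity over simple bag-of-words vectors rather than whatever encoder the authors actually use.

```python
# Sketch: pick the commonsense inference most similar to an utterance.
# Assumptions: candidates are pre-generated by an external knowledge model;
# bag-of-words cosine similarity stands in for the paper's selection metric.
from collections import Counter
import math


def bow_vector(text):
    """Lower-cased bag-of-words counts as a stand-in for a sentence encoder."""
    return Counter(text.lower().split())


def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = math.sqrt(sum(c * c for c in u.values())) * math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0


def select_inference(utterance, candidates):
    """Return the candidate commonsense inference closest to the utterance."""
    utt_vec = bow_vector(utterance)
    return max(candidates, key=lambda c: cosine(utt_vec, bow_vector(c)))


if __name__ == "__main__":
    utterance = "Amanda: I baked cookies. Do you want some?"
    candidates = [  # hypothetical outputs of an external commonsense model
        "Amanda wants to share food",
        "Amanda is at the gym",
        "Amanda wants to be generous",
    ]
    print(select_inference(utterance, candidates))
```

In SICK the selected inference is then appended to the dialogue as additional context for the summarizer; SICK++ further uses such inferences as an auxiliary generation target during training.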
Benchmarks
| Benchmark | Methodology | BERTScore | ROUGE-1 | ROUGE-2 | ROUGE-L |
|---|---|---|---|---|---|
| text-summarization-on-dialogsum | SICK | 71.30 | 46.26 | 20.95 | 41.05 |
| text-summarization-on-samsum-corpus | SICK | 71.92 | 53.73 | 28.81 | 49.5 |