Unleashing the Power of Neural Discourse Parsers -- A Context and Structure Aware Approach Using Large Scale Pretraining

Grigorii Guz, Patrick Huber, Giuseppe Carenini

Abstract

RST-based discourse parsing is an important NLP task with numerous downstream applications, such as summarization, machine translation, and opinion mining. In this paper, we present a simple yet highly accurate discourse parser that incorporates recent contextual language models. Our parser establishes new state-of-the-art (SOTA) performance for predicting structure and nuclearity on two key RST datasets, RST-DT and Instr-DT. We further demonstrate that pretraining our parser on the recently released large-scale "silver-standard" discourse treebank MEGA-DT yields even larger performance gains, suggesting a novel and promising research direction in the field of discourse analysis.

Benchmarks

Benchmark                                   | Methodology                      | Standard Parseval (Span) | Standard Parseval (Nuclearity)
--------------------------------------------|----------------------------------|--------------------------|-------------------------------
discourse-parsing-on-instructional-dt-instr | Guz et al. (2020)                | 64.55                    | 44.41
discourse-parsing-on-instructional-dt-instr | Guz et al. (2020) (pretrained)   | 65.41                    | 46.59
discourse-parsing-on-rst-dt                 | Guz et al. (2020)                | 72.43                    | 61.38
discourse-parsing-on-rst-dt                 | Guz et al. (2020) (pretrained)   | 72.94                    | 61.86
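The scores above are Standard (micro-averaged) Parseval F1 values: a predicted RST tree is decomposed into its constituents, and each constituent counts as correct if its text span matches a gold constituent (Span), or if both the span and its nuclearity label match (Nuclearity). The sketch below illustrates this comparison on toy trees; it is an illustrative implementation, not the evaluation code used by the authors, and the EDU-index tuples are made-up examples.

```python
# Hedged sketch of Standard Parseval scoring for RST trees.
# A constituent is a tuple of EDU indices (start, end); for the
# nuclearity setting, each tuple also carries a nuclearity label.

def parseval_f1(pred_constituents, gold_constituents):
    """Micro-averaged Parseval F1 over two sets of constituents."""
    pred, gold = set(pred_constituents), set(gold_constituents)
    if not pred or not gold:
        return 0.0
    matched = len(pred & gold)          # constituents present in both trees
    precision = matched / len(pred)
    recall = matched / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Span evaluation: constituents are (start_edu, end_edu) pairs.
gold_spans = {(0, 3), (0, 1), (2, 3)}
pred_spans = {(0, 3), (0, 1), (1, 3)}
span_f1 = parseval_f1(pred_spans, gold_spans)   # 2 of 3 spans match -> 2/3

# Nuclearity evaluation: the same spans, each with a nuclearity label
# (e.g. "NS" = nucleus-satellite, "NN" = multi-nuclear).
gold_nuc = {(0, 3, "NS"), (0, 1, "NN"), (2, 3, "NS")}
pred_nuc = {(0, 3, "NS"), (0, 1, "NS"), (1, 3, "NS")}
nuc_f1 = parseval_f1(pred_nuc, gold_nuc)        # only the root matches -> 1/3
```

Because nuclearity requires the span to match as well, the Nuclearity score is always bounded above by the Span score, which is consistent with every row of the table.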
