HyperAI


Levi Graph AMR Parser using Heterogeneous Attention

Han He, Jinho D. Choi

Abstract

Coupled with biaffine decoders, transformers have been effectively adapted to text-to-graph transduction and achieve state-of-the-art performance on AMR parsing. Many prior works, however, rely on the biaffine decoder for arc prediction, label prediction, or both, although most features used by the decoder may already be learned by the transformer. This paper presents a novel approach to AMR parsing that combines heterogeneous data (tokens, concepts, labels) as one input to a transformer to learn attention, and uses only the attention matrices from the transformer to predict all elements in AMR graphs (concepts, arcs, labels). Although our models use significantly fewer parameters than the previous state-of-the-art graph parser, they show similar or better accuracy on AMR 2.0 and 3.0.
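To make the core idea concrete, the following is a minimal numpy sketch of reading arc scores directly off a transformer attention matrix rather than from a separate biaffine decoder. Everything here is illustrative: the embedding dimension, the sequence layout (tokens, concepts, and Levi-graph label nodes concatenated into one heterogeneous input), and the single random-weight attention head are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_matrix(X, Wq, Wk):
    # scaled dot-product attention weights for one head
    Q, K = X @ Wq, X @ Wk
    return softmax(Q @ K.T / np.sqrt(K.shape[-1]))

rng = np.random.default_rng(0)
d = 16  # hypothetical embedding size

# hypothetical heterogeneous input: one sequence mixing all element types
tokens = rng.normal(size=(5, d))    # 5 source tokens
concepts = rng.normal(size=(3, d))  # 3 predicted concepts
labels = rng.normal(size=(2, d))    # 2 relation-label nodes (Levi-graph style)
X = np.concatenate([tokens, concepts, labels])  # shape (10, d)

# untrained random projections stand in for learned attention weights
A = attention_matrix(X, rng.normal(size=(d, d)), rng.normal(size=(d, d)))

# the concept-to-concept submatrix of A serves as the arc scores,
# so no biaffine decoder is needed on top
arc_scores = A[5:8, 5:8]
print(arc_scores.shape)  # (3, 3)
```

In a trained model, a head of this attention matrix would be supervised to put mass on the correct head/dependent pairs, so arc (and, with label nodes in the sequence, label) predictions fall out of the same matrices the transformer already computes.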

Benchmarks

Benchmark: amr-parsing-on-ldc2017t10
Methodology: ND+AD+LV
Metrics: Smatch: 80
