HyperAI

A Graph-to-Sequence Model for AMR-to-Text Generation

Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea

Abstract

The problem of AMR-to-text generation is to recover a text conveying the same meaning as an input AMR graph. The current state-of-the-art method uses a sequence-to-sequence model, leveraging an LSTM to encode a linearized AMR structure. Although able to model non-local semantic information, a sequence LSTM can lose information from the AMR graph structure, and thus struggles with large graphs, which result in long sequences. We introduce a neural graph-to-sequence model, using a novel LSTM structure that directly encodes graph-level semantics. On a standard benchmark, our model shows superior results to existing methods in the literature.
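The key idea above is to update node states over the AMR graph directly, rather than over a linearization, so that information flows along graph edges. The following is a minimal sketch of one such graph-state update step; the function name, weight shapes, and the simplified tanh update (in place of the paper's full gated LSTM cell) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def message_passing_step(h, edges, W_in, W_self):
    """One graph-state update: each node sums states from its incoming
    neighbors, then combines the aggregate with its own state.
    Simplified sketch: a tanh mix stands in for the paper's gated LSTM cell."""
    m = np.zeros_like(h)                 # aggregated incoming messages
    for src, dst in edges:
        m[dst] += h[src]
    return np.tanh(m @ W_in + h @ W_self)

# Tiny AMR-like graph for (want :ARG0 boy :ARG1 go): nodes [want, boy, go]
rng = np.random.default_rng(0)
d = 4
h = rng.normal(size=(3, d))              # initial node states
edges = [(1, 0), (2, 0)]                 # boy -> want, go -> want
W_in = rng.normal(size=(d, d)) * 0.1     # hypothetical parameters
W_self = rng.normal(size=(d, d)) * 0.1

for _ in range(2):                       # repeated steps propagate non-local info
    h = message_passing_step(h, edges, W_in, W_self)
print(h.shape)  # (3, 4)
```

Running several such steps lets each node's state absorb information from multi-hop neighbors, which is how a graph encoder avoids the information loss of a linearized sequence.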

Code Repositories

freesunshine0316/neural-graph-to-seq-mp (official, TensorFlow)

Benchmarks

Benchmark | Methodology | Metrics
graph-to-sequence-on-ldc2015e86 | GRN | BLEU: 33.6
text-generation-on-ldc2016e25 | Graph2Seq | BLEU: 22

