Graph Pre-training for AMR Parsing and Generation

Xuefeng Bai, Yulong Chen, Yue Zhang

Abstract

Abstract meaning representation (AMR) highlights the core semantic information of text in a graph structure. Recently, pre-trained language models (PLMs) have advanced the tasks of AMR parsing and AMR-to-text generation. However, PLMs are typically pre-trained on textual data and are thus sub-optimal for modeling structural knowledge. To this end, we investigate graph self-supervised training to improve the structure awareness of PLMs over AMR graphs. In particular, we introduce two graph auto-encoding strategies for graph-to-graph pre-training and four tasks to integrate text and graph information during pre-training. We further design a unified framework to bridge the gap between pre-training and fine-tuning tasks. Experiments on both AMR parsing and AMR-to-text generation show the superiority of our model. To our knowledge, we are the first to consider pre-training on semantic graphs.
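
The abstract gives no implementation detail, so the following is only a rough sketch of the graph auto-encoding idea under our own assumptions: a linearized AMR graph is corrupted by masking node and edge tokens, and a BART-style sequence-to-sequence model is trained to reconstruct the original linearization. The helper name `mask_linearized_amr`, the masking probability, and the base checkpoint are illustrative choices, not the authors' settings.

```python
# A minimal sketch (not the authors' released code) of the graph-masked
# auto-encoding idea: node/edge tokens of a linearized AMR graph are masked
# and a BART-style seq2seq model is trained to reconstruct the full graph.
import random

from transformers import BartForConditionalGeneration, BartTokenizer

MODEL_NAME = "facebook/bart-base"  # assumption: any BART checkpoint as the starting point

tokenizer = BartTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)


def mask_linearized_amr(tokens, mask_prob=0.35, mask_token="<mask>"):
    """Randomly replace tokens of a linearized AMR graph with the mask token."""
    return [mask_token if random.random() < mask_prob else tok for tok in tokens]


# Linearized AMR for "The boy wants to go" (PENMAN-style, space-tokenized here
# purely for illustration).
amr_tokens = "( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 boy ) )".split()
corrupted = " ".join(mask_linearized_amr(amr_tokens))
target = " ".join(amr_tokens)

batch = tokenizer(corrupted, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss  # denoising reconstruction loss
loss.backward()  # an optimizer step would follow in a real pre-training loop
```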

Code Repositories

goodbai-nlp/amrbart (PyTorch; mentioned in GitHub)
muyeby/amrbart (official; PyTorch; mentioned in GitHub)
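
The repositories above contain the authors' training and inference scripts. Purely as an illustration of how a fine-tuned parsing checkpoint in Hugging Face format might be queried, here is a hedged sketch; the checkpoint path is a placeholder, and the generation settings are generic seq2seq defaults rather than values taken from the repositories.

```python
# Hedged sketch: AMR parsing with a fine-tuned seq2seq checkpoint exported in
# Hugging Face format. The path below is a placeholder, not a published model.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

CKPT = "path/to/amrbart-parsing-checkpoint"  # hypothetical local checkpoint

tokenizer = AutoTokenizer.from_pretrained(CKPT)
model = AutoModelForSeq2SeqLM.from_pretrained(CKPT)

sentence = "The boy wants to go."
inputs = tokenizer(sentence, return_tensors="pt")

# Generate a linearized AMR graph for the input sentence.
output_ids = model.generate(**inputs, num_beams=5, max_length=512)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```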

Benchmarks

Benchmark                           Methodology      Smatch
amr-parsing-on-bio                  AMRBART large    63.2
amr-parsing-on-ldc2017t10           AMRBART large    85.4
amr-parsing-on-ldc2020t02           AMRBART large    84.2
amr-parsing-on-new3                 AMRBART large    76.9
amr-parsing-on-the-little-prince    AMRBART large    79.8
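
The metric in the table is Smatch, the F1 score over matched triples between predicted and gold AMR graphs. Below is a minimal sketch of the computation, assuming the `smatch` Python package and its `get_amr_match`/`compute_f` helpers; the example graphs are illustrative only.

```python
# Sketch of how Smatch scores like those above are computed: precision, recall
# and F1 over matched triples of a predicted vs. a gold AMR graph.
# Assumes the `smatch` package (pip install smatch).
import smatch

gold = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
pred = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02))"

match_num, test_num, gold_num = smatch.get_amr_match(pred, gold)
precision, recall, f_score = smatch.compute_f(match_num, test_num, gold_num)
print(f"Smatch P={precision:.3f} R={recall:.3f} F1={f_score:.3f}")
```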
