Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling

Alireza Mohammadshahi, James Henderson

Abstract

Recent models have shown that incorporating syntactic knowledge into the semantic role labelling (SRL) task leads to significant improvements. In this paper, we propose the Syntax-aware Graph-to-Graph Transformer (SynG2G-Tr) model, which encodes syntactic structure by inputting graph relations as embeddings directly into the self-attention mechanism of the Transformer. This approach adds a soft bias towards attention patterns that follow the syntactic structure, while still allowing the model to use this information to learn alternative patterns. We evaluate our model on both span-based and dependency-based SRL datasets (CoNLL 2005 and CoNLL 2009), and it outperforms previous alternative methods in both in-domain and out-of-domain settings.
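To make the core idea concrete, below is a minimal sketch (not the authors' released code) of relation-aware self-attention in which the syntactic relation between two tokens contributes a learned embedding to the attention score, producing the soft syntactic bias described in the abstract. The module name, dimensions, and the choice to add the relation term only on the score (key) side are illustrative assumptions.

```python
# Illustrative sketch of syntax-aware self-attention: the dependency relation
# between tokens i and j is embedded and added into the attention score,
# softly biasing attention towards syntactically related pairs.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntaxAwareSelfAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, n_relations: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.d_head = d_model // n_heads
        self.n_heads = n_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        # One embedding per dependency-relation label (index 0 = "no relation"),
        # shared across heads and used on the score side of the attention.
        self.rel_key = nn.Embedding(n_relations, self.d_head)

    def forward(self, x, rel_ids):
        # x:       (batch, seq, d_model) token representations
        # rel_ids: (batch, seq, seq) integer id of the syntactic relation
        #          between token i and token j (0 if none)
        b, n, _ = x.shape
        q = self.q_proj(x).view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        k = self.k_proj(x).view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        v = self.v_proj(x).view(b, n, self.n_heads, self.d_head).transpose(1, 2)

        # Standard content-content attention scores: (batch, heads, seq, seq)
        scores = torch.matmul(q, k.transpose(-2, -1))

        # Relation term: q_i . r_ij added to the score, so syntactically related
        # pairs receive a learned boost rather than a hard constraint.
        r = self.rel_key(rel_ids)                         # (batch, seq, seq, d_head)
        rel_scores = torch.einsum('bhid,bijd->bhij', q, r)
        scores = (scores + rel_scores) / math.sqrt(self.d_head)

        attn = F.softmax(scores, dim=-1)
        out = torch.matmul(attn, v)                       # (batch, heads, seq, d_head)
        out = out.transpose(1, 2).contiguous().view(b, n, -1)
        return self.out_proj(out)
```

Because the relation embeddings only shift the pre-softmax scores, attention can still concentrate on pairs with no syntactic link when the data favours it, which is the "soft bias" behaviour the paper emphasises.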

Benchmarks

Benchmark: semantic-role-labeling-on-conll-2005
Methodology: Mohammadshahi and Henderson (2021)
Metrics: F1: 88.93