Deep Semantic Role Labeling with Self-Attention

Zhixing Tan; Mingxuan Wang; Jun Xie; Yidong Chen; Xiaodong Shi

Abstract

Semantic Role Labeling (SRL) is believed to be a crucial step towards natural language understanding and has been widely studied. In recent years, end-to-end SRL with recurrent neural networks (RNNs) has gained increasing attention. However, it remains a major challenge for RNNs to handle structural information and long-range dependencies. In this paper, we present a simple and effective architecture for SRL that aims to address these problems. Our model is based on self-attention, which can directly capture the relationship between two tokens regardless of their distance. Our single model achieves F$_1=83.4$ on the CoNLL-2005 shared task dataset and F$_1=82.7$ on the CoNLL-2012 shared task dataset, outperforming the previous state-of-the-art results by $1.8$ and $1.0$ F$_1$ points respectively. In addition, our model is computationally efficient: parsing speed is 50K tokens per second on a single Titan X GPU.
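To see why self-attention sidesteps the long-range-dependency problem, note that every token attends to every other token in a single step, so the path length between any two positions is constant rather than proportional to their distance as in an RNN. The sketch below is a minimal NumPy implementation of scaled dot-product self-attention, the building block the paper relies on; the function name, projection matrices, and toy dimensions are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one sequence.

    x:   (seq_len, d_model) token representations
    w_*: (d_model, d_k) learned projection matrices (illustrative)
    Each token scores every other token directly, so distant
    tokens interact in O(1) steps regardless of sequence length.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])  # (seq_len, seq_len) pairwise scores
    weights = softmax(scores, axis=-1)       # attention distribution per token
    return weights @ v                       # weighted sum of value vectors

# Toy usage: 5 tokens, model width 8, head width 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (5, 4)
```

The full model stacks several such layers (with multiple heads and position information) over the input sentence before predicting a role label per token; see the official repository below for the authors' TensorFlow implementation.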

Code Repositories

XMUNLP/Tagger (official implementation, TensorFlow)

Benchmarks

Benchmark | Methodology | Metrics
Semantic Role Labeling on OntoNotes | Tan et al. | F1: 82.7
