A Frustratingly Easy Approach for Entity and Relation Extraction

Zexuan Zhong, Danqi Chen

Abstract

End-to-end relation extraction aims to identify named entities and extract relations between them. Most recent work models these two subtasks jointly, either by casting them in one structured prediction framework, or performing multi-task learning through shared representations. In this work, we present a simple pipelined approach for entity and relation extraction, and establish the new state-of-the-art on standard benchmarks (ACE04, ACE05 and SciERC), obtaining a 1.7%-2.8% absolute improvement in relation F1 over previous joint models with the same pre-trained encoders. Our approach essentially builds on two independent encoders and merely uses the entity model to construct the input for the relation model. Through a series of careful examinations, we validate the importance of learning distinct contextual representations for entities and relations, fusing entity information early in the relation model, and incorporating global context. Finally, we also present an efficient approximation to our approach which requires only one pass of both entity and relation encoders at inference time, achieving an 8-16$\times$ speedup with a slight reduction in accuracy.
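
To make the pipelined recipe concrete, below is a minimal, self-contained Python sketch; it is not the authors' code from princeton-nlp/PURE. An entity model first predicts typed spans, and a separate relation model then classifies each candidate span pair over an input in which the predicted entity types have been injected as textual markers. The model internals are stubbed with toy logic, and the marker format and helper names are assumptions made purely for illustration.

```python
from typing import List, Tuple

# (start token index, end token index, predicted entity type)
Span = Tuple[int, int, str]


def predict_entities(tokens: List[str]) -> List[Span]:
    """Stand-in for the span-based entity model; a real system would use a
    pre-trained encoder. Toy rule: capitalized tokens become PER spans."""
    return [(i, i, "PER") for i, tok in enumerate(tokens) if tok[:1].isupper()]


def build_relation_input(tokens: List[str], subj: Span, obj: Span) -> List[str]:
    """Fuse entity information early: wrap the subject and object spans with
    typed marker tokens before they are fed to the relation model."""
    out = []
    for i, tok in enumerate(tokens):
        if i == subj[0]:
            out.append(f"<S:{subj[2]}>")
        if i == obj[0]:
            out.append(f"<O:{obj[2]}>")
        out.append(tok)
        if i == subj[1]:
            out.append(f"</S:{subj[2]}>")
        if i == obj[1]:
            out.append(f"</O:{obj[2]}>")
    return out


def predict_relation(marked_tokens: List[str]) -> str:
    """Stand-in for the relation model, which runs its own, separate encoder
    over the marker-augmented sentence."""
    return "no_relation"  # a trained classifier would label the marked pair


if __name__ == "__main__":
    tokens = "Alice works for Acme in Berlin".split()
    entities = predict_entities(tokens)      # step 1: entity model
    for subj in entities:                    # step 2: relation model per pair
        for obj in entities:
            if subj == obj:
                continue
            marked = build_relation_input(tokens, subj, obj)
            print(predict_relation(marked), " ".join(marked))
```

In this full pipeline, each marked span pair would require its own pass of the relation encoder; the approximation mentioned in the abstract instead handles all pairs with a single pass of both encoders at inference time, which is the source of the reported 8-16$\times$ speedup.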

Code Repositories

princeton-nlp/PURE (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
joint-entity-and-relation-extraction-on | Ours: cross-sentence | Cross Sentence: Yes; Entity F1: 68.9; RE+ Micro F1: 36.7; Relation F1: 50.1
joint-entity-and-relation-extraction-on-ace | Ours: cross-sentence ALB | Relation F1: 62.2
named-entity-recognition-ner-on-scierc | Ours: cross-sentence | F1: 68.2
named-entity-recognition-on-ace-2004 | Ours: cross-sentence ALB | F1: 90.3; Multi-Task Supervision: y
named-entity-recognition-on-ace-2005 | Ours: cross-sentence ALB | F1: 90.9
relation-extraction-on-ace-2004 | Ours: cross-sentence ALB | Cross Sentence: Yes; NER Micro F1: 90.3; RE Micro F1: 66.1; RE+ Micro F1: 62.2
relation-extraction-on-ace-2005 | Ours: cross-sentence ALB | Cross Sentence: Yes; NER Micro F1: 90.9; RE Micro F1: 69.4; Sentence Encoder: ALBERT
