Span-based Joint Entity and Relation Extraction with Transformer Pre-training

Markus Eberts, Adrian Ulges

Abstract

We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our key contribution is a light-weight reasoning on BERT embeddings, which features entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass. These aspects facilitate a search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling and localized context. Our model outperforms prior work by up to 2.6% F1 score on several datasets for joint entity and relation extraction.
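To make the span-based idea concrete, the sketch below shows how all candidate spans in a sentence can be enumerated and scored from a single BERT pass: each span is represented by max-pooled token embeddings, a span-width embedding, and the sentence [CLS] embedding, then passed to an entity classifier. This is a minimal illustration written against the paper's description, not the official markus-eberts/spert code; class names, the choice of "bert-base-cased", and hyperparameters such as max_span_width and the number of entity types are assumptions, and relation classification over filtered span pairs is omitted.

```python
# Hypothetical sketch of span enumeration and classification on BERT embeddings.
# Names (SpanClassifier, max_span_width, ...) are illustrative, not from the official SpERT repo.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SpanClassifier(nn.Module):
    def __init__(self, model_name="bert-base-cased", num_entity_types=5,
                 max_span_width=10, width_dim=25):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.width_emb = nn.Embedding(max_span_width + 1, width_dim)
        # span representation = [max-pooled span tokens; width embedding; CLS embedding]
        self.entity_clf = nn.Linear(2 * hidden + width_dim, num_entity_types)
        self.max_span_width = max_span_width

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        h = out.last_hidden_state            # (1, seq_len, hidden), one BERT pass
        cls = h[:, 0]                        # sentence-level [CLS] embedding
        seq_len = int(attention_mask.sum())
        spans, logits = [], []
        for start in range(1, seq_len - 1):  # skip [CLS] and [SEP]
            for end in range(start, min(start + self.max_span_width, seq_len - 1)):
                pooled = h[0, start:end + 1].max(dim=0).values          # max-pool span tokens
                width = self.width_emb(torch.tensor(end - start + 1))   # span-width embedding
                rep = torch.cat([pooled, width, cls[0]], dim=-1)
                spans.append((start, end))
                logits.append(self.entity_clf(rep))
        return spans, torch.stack(logits)    # entity-type scores for every candidate span

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = SpanClassifier()
enc = tokenizer("Markus Eberts works at RheinMain University.", return_tensors="pt")
spans, scores = model(enc["input_ids"], enc["attention_mask"])
print(len(spans), scores.shape)  # all spans scored without re-encoding the sentence
```

In the full model, spans classified as non-entities would be filtered out, and remaining span pairs would be scored by a relation classifier that also uses a localized, marker-free context representation; that step is left out here for brevity.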

Code Repositories

markus-eberts/spert (official, PyTorch)
lavis-nlp/spert (PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
joint-entity-and-relation-extraction-on | SpERT | Cross Sentence: No; Entity F1: 70.33
joint-entity-and-relation-extraction-on | SpERT (with overlap) | Cross Sentence: No; Entity F1: 70.3; Relation F1: 50.84
named-entity-recognition-ner-on-scierc | SpERT | F1: 70.33
relation-extraction-on-ade-corpus | SpERT (without overlap) | NER Macro F1: 89.25; RE+ Macro F1: 79.24
relation-extraction-on-ade-corpus | SpERT (with overlap) | NER Macro F1: 89.28; RE+ Macro F1: 78.84
relation-extraction-on-conll04 | SpERT | NER Macro F1: 86.25; NER Micro F1: 88.94; RE+ Macro F1: 72.87; RE+ Micro F1: 71.47
