Span-based Joint Entity and Relation Extraction with Transformer Pre-training
Markus Eberts, Adrian Ulges

Abstract
We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our key contribution is a light-weight reasoning on BERT embeddings, which features entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass. These aspects facilitate a search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling and localized context. Our model outperforms prior work by up to 2.6% F1 score on several datasets for joint entity and relation extraction.
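To make the span-based approach concrete, below is a minimal sketch of SpERT-style entity and relation scoring on top of token embeddings from a single BERT pass. The class name `SpERTSketch`, the hidden and width-embedding dimensions, and the zero vector for adjacent spans are illustrative assumptions, not the authors' exact configuration; it does reflect the paper's core design of max-pooled span representations fused with a width embedding and the [CLS] token, plus a max-pooled localized context between the two spans for relation classification.

```python
# Sketch of SpERT-style span and relation classifiers (assumptions noted above).
import torch
import torch.nn as nn


class SpERTSketch(nn.Module):
    def __init__(self, hidden=768, width_dim=25, max_width=10,
                 n_entity_types=5, n_relation_types=6):
        super().__init__()
        self.width_emb = nn.Embedding(max_width + 1, width_dim)
        # entity classifier input: [max-pooled span; width embedding; CLS]
        self.entity_clf = nn.Linear(2 * hidden + width_dim, n_entity_types)
        # relation classifier input: [span1; span2; localized context; widths]
        self.rel_clf = nn.Linear(3 * hidden + 2 * width_dim, n_relation_types)

    def span_repr(self, tokens, start, end):
        """Max-pool the BERT token embeddings of the span [start, end)."""
        return tokens[start:end].max(dim=0).values

    def classify_entity(self, tokens, cls, start, end):
        span = self.span_repr(tokens, start, end)
        width = self.width_emb(torch.tensor(end - start))
        return self.entity_clf(torch.cat([span, width, cls]))

    def classify_relation(self, tokens, s1, s2):
        (a1, b1), (a2, b2) = sorted([s1, s2])
        head = self.span_repr(tokens, a1, b1)
        tail = self.span_repr(tokens, a2, b2)
        # localized context: max-pool tokens strictly between the two spans;
        # for adjacent or overlapping spans we fall back to a zero vector
        # (an assumption of this sketch).
        if b1 < a2:
            ctx = self.span_repr(tokens, b1, a2)
        else:
            ctx = torch.zeros_like(head)
        w1 = self.width_emb(torch.tensor(b1 - a1))
        w2 = self.width_emb(torch.tensor(b2 - a2))
        return self.rel_clf(torch.cat([head, tail, ctx, w1, w2]))


# Usage with random tensors standing in for one BERT pass over a sentence.
model = SpERTSketch()
tokens = torch.randn(12, 768)   # per-token embeddings of a 12-token sentence
cls = torch.randn(768)          # [CLS] embedding of the sentence
entity_logits = model.classify_entity(tokens, cls, start=2, end=4)
relation_logits = model.classify_relation(tokens, (2, 4), (7, 9))
print(entity_logits.shape, relation_logits.shape)
```

Because both classifiers only read slices of the token-embedding matrix, all candidate spans and span pairs in a sentence can be scored from one encoder pass, which is what makes the exhaustive span search and strong within-sentence negative sampling cheap.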
Benchmarks
| Benchmark | Model | Metrics |
|---|---|---|
| joint-entity-and-relation-extraction-on | SpERT | Cross Sentence: No; Entity F1: 70.33 |
| joint-entity-and-relation-extraction-on | SpERT (with overlap) | Cross Sentence: No; Entity F1: 70.3; Relation F1: 50.84 |
| named-entity-recognition-ner-on-scierc | SpERT | F1: 70.33 |
| relation-extraction-on-ade-corpus | SpERT (without overlap) | NER Macro F1: 89.25; RE+ Macro F1: 79.24 |
| relation-extraction-on-ade-corpus | SpERT (with overlap) | NER Macro F1: 89.28; RE+ Macro F1: 78.84 |
| relation-extraction-on-conll04 | SpERT | NER Macro F1: 86.25; NER Micro F1: 88.94; RE+ Macro F1: 72.87; RE+ Micro F1: 71.47 |