
Enriching Pre-trained Language Model with Entity Information for Relation Classification

Shanchan Wu; Yifan He

Abstract

Relation classification is an important NLP task for extracting relations between entities. The state-of-the-art methods for relation classification are primarily based on convolutional or recurrent neural networks. Recently, the pre-trained BERT model has achieved very successful results on many NLP classification and sequence-labeling tasks. Relation classification differs from those tasks in that it relies on information from both the sentence and the two target entities. In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task. We locate the target entities, transfer the information through the pre-trained architecture, and incorporate the corresponding encodings of the two entities. We achieve a significant improvement over the state-of-the-art method on the SemEval-2010 Task 8 relation classification dataset.
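
The approach the abstract describes is compact enough to sketch. Below is a minimal, illustrative PyTorch implementation, assuming the HuggingFace `transformers` API; the class name `RBertSketch`, the span-mask inputs (`e1_mask`, `e2_mask`), and the averaging helper are assumptions for illustration, while the tanh-activated projections and the [CLS]-plus-entity-span concatenation follow the R-BERT design the paper presents.

```python
# Illustrative R-BERT-style model: BERT encoding plus averaged entity-span
# vectors, concatenated with the [CLS] vector and fed to a linear classifier.
# Assumes the HuggingFace `transformers` library; names here are hypothetical.
import torch
import torch.nn as nn
from transformers import BertModel


class RBertSketch(nn.Module):
    def __init__(self, num_labels: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.cls_fc = nn.Linear(hidden, hidden)  # projection for [CLS]
        self.ent_fc = nn.Linear(hidden, hidden)  # shared projection for both entities
        self.classifier = nn.Linear(hidden * 3, num_labels)

    @staticmethod
    def _span_average(states, span_mask):
        # Average the hidden states over the tokens of one entity span.
        # states: (batch, seq_len, hidden); span_mask: (batch, seq_len) in {0, 1}
        mask = span_mask.unsqueeze(-1).float()
        return (states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

    def forward(self, input_ids, attention_mask, e1_mask, e2_mask):
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        cls_vec = torch.tanh(self.cls_fc(states[:, 0]))  # [CLS] position
        e1_vec = torch.tanh(self.ent_fc(self._span_average(states, e1_mask)))
        e2_vec = torch.tanh(self.ent_fc(self._span_average(states, e2_mask)))
        return self.classifier(torch.cat([cls_vec, e1_vec, e2_vec], dim=-1))
```

In the paper's setup, the two target entities are delimited with special marker tokens before encoding, e.g. "[CLS] the $ kitchen $ is the last renovated part of the # house # . [SEP]", and `e1_mask` / `e2_mask` would flag the token positions of each marked span.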

Code Repositories

mickeystroller/R-BERT (PyTorch)
wang-h/bert-relation-classification (PyTorch)
Valdegg/anlp_rbert
monologg/R-BERT (PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
Relation Extraction on SemEval-2010 Task 8 | R-BERT | F1: 89.25
Relation Extraction on TACRED | R-BERT | F1: 69.4
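
For context, the SemEval-2010 Task 8 score above is conventionally the macro-averaged F1 over the nine directional relation types, with the catch-all "Other" class excluded from the average, following the task's official scorer. A minimal sketch of that convention, assuming scikit-learn; `gold` and `pred` are hypothetical label lists:

```python
# Sketch of the SemEval-2010 Task 8 macro-F1 convention, assuming scikit-learn.
# `gold` and `pred` are hypothetical labels, not data from the paper.
from sklearn.metrics import f1_score

gold = ["Cause-Effect(e1,e2)", "Other", "Component-Whole(e2,e1)"]
pred = ["Cause-Effect(e1,e2)", "Component-Whole(e2,e1)", "Component-Whole(e2,e1)"]

# Macro-average over the real relation classes only; "Other" is excluded
# from the average, per the official scorer's convention.
relation_labels = sorted({label for label in gold if label != "Other"})
macro_f1 = f1_score(gold, pred, labels=relation_labels, average="macro")
print(f"macro-F1 (excluding Other): {macro_f1:.4f}")
```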
