Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation

Qingyu Tan, Ruidan He, Lidong Bing, Hwee Tou Ng

Abstract

Document-level Relation Extraction (DocRE) is a more challenging task compared to its sentence-level counterpart. It aims to extract relations from multiple sentences at once. In this paper, we propose a semi-supervised framework for DocRE with three novel components. Firstly, we use an axial attention module for learning the interdependency among entity pairs, which improves the performance on two-hop relations. Secondly, we propose an adaptive focal loss to tackle the class imbalance problem of DocRE. Lastly, we use knowledge distillation to overcome the differences between human-annotated data and distantly supervised data. We conducted experiments on two DocRE datasets. Our model consistently outperforms strong baselines and its performance exceeds the previous SOTA by 1.36 F1 and 1.46 Ign_F1 score on the DocRED leaderboard. Our code and data will be released at https://github.com/tonytan48/KD-DocRE.
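
For readers who want a concrete picture of the two training-signal ideas named above, the PyTorch sketch below illustrates (a) a focal-style weighting that down-weights easy entity pairs to counter class imbalance, and (b) a soft-label distillation term in which a teacher trained on human-annotated data guides a student trained on distantly supervised data. This is a minimal sketch under stated assumptions: the function names, the gamma and alpha hyperparameters, and the MSE distillation objective are illustrative, not the paper's exact Adaptive Focal Loss or distillation setup; see the official repository for the authors' implementation.

```python
# Hypothetical sketch (not the authors' exact formulation): a focal-style
# multi-label relation loss plus a soft-label distillation term, in PyTorch.
import torch
import torch.nn.functional as F


def focal_relation_loss(logits, labels, gamma=2.0):
    """Binary focal loss over relation labels for each entity pair.

    logits: (num_pairs, num_relations) raw scores
    labels: (num_pairs, num_relations) multi-hot gold relations
    Down-weights easy examples so rare (long-tail) relations receive
    relatively more gradient signal.
    """
    probs = torch.sigmoid(logits)
    # p_t is the probability assigned to the true outcome of each label.
    p_t = probs * labels + (1.0 - probs) * (1.0 - labels)
    ce = F.binary_cross_entropy_with_logits(logits, labels, reduction="none")
    return ((1.0 - p_t) ** gamma * ce).mean()


def distillation_loss(student_logits, teacher_logits):
    """Match the student's scores to a teacher's soft predictions (MSE here),
    e.g. a teacher trained on human-annotated data guiding a student trained
    on distantly supervised data."""
    return F.mse_loss(student_logits, teacher_logits.detach())


# Example usage: combine the two objectives with an assumed weighting alpha.
if __name__ == "__main__":
    num_pairs, num_relations = 8, 97  # DocRED: 96 relation types + no-relation
    student_logits = torch.randn(num_pairs, num_relations, requires_grad=True)
    teacher_logits = torch.randn(num_pairs, num_relations)
    labels = torch.bernoulli(torch.full((num_pairs, num_relations), 0.05))

    alpha = 0.5
    loss = focal_relation_loss(student_logits, labels) \
        + alpha * distillation_loss(student_logits, teacher_logits)
    loss.backward()
    print(float(loss))
```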

Code Repositories

tonytan48/kd-docre (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
relation-extraction-on-docred | KD-Rb-l | F1: 67.28, Ign F1: 65.24
relation-extraction-on-redocred | KD-DocRE | F1: 78.28, Ign F1: 77.60
