Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
Yuhao Zhang; Peng Qi; Christopher D. Manning

Abstract
Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
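The abstract describes two ingredients: a graph convolution layer that pools over the dependency adjacency matrix, and a path-centric pruning strategy that keeps only tokens within distance K of the shortest dependency path between the two entities. Below is a minimal PyTorch sketch of both under stated assumptions: dependency heads are 1-based with 0 marking the root, each entity is reduced to a single anchor token for brevity, and names like `prune_to_path` and `GraphConvLayer` are illustrative, not the authors' released code.

```python
from collections import deque

import torch
import torch.nn as nn


def path_to_root(head, i):
    """Ancestors of token i (inclusive); head[i] is 1-based, 0 = root."""
    path = [i]
    while head[i] != 0:
        i = head[i] - 1
        path.append(i)
    return path


def prune_to_path(head, subj, obj, k=1):
    """Path-centric pruning: keep tokens within k hops of the shortest
    dependency path between subj and obj (token indices, assumed single
    anchor tokens per entity)."""
    a, b = set(path_to_root(head, subj)), set(path_to_root(head, obj))
    common = a & b
    # lowest common ancestor = deepest node shared by both ancestor chains
    lca = max(common, key=lambda x: len(path_to_root(head, x)))
    path = (a - common) | (b - common) | {lca}
    # undirected adjacency for breadth-first expansion around the path
    adj = [[] for _ in head]
    for child, parent in enumerate(head):
        if parent != 0:
            adj[child].append(parent - 1)
            adj[parent - 1].append(child)
    keep, frontier = set(path), deque((n, 0) for n in path)
    while frontier:
        node, dist = frontier.popleft()
        if dist == k:
            continue
        for nb in adj[node]:
            if nb not in keep:
                keep.add(nb)
                frontier.append((nb, dist + 1))
    return sorted(keep)


class GraphConvLayer(nn.Module):
    """One graph convolution: each token aggregates its neighbors
    (plus a self-loop) over the pruned dependency adjacency matrix,
    normalized by node degree."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.linear = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, adj, h):
        # adj: (batch, n, n) 0/1 adjacency of the pruned tree
        # h:   (batch, n, hidden_dim) token representations
        eye = torch.eye(adj.size(1), device=adj.device).unsqueeze(0)
        adj_tilde = adj + eye                        # add self-loops
        degree = adj_tilde.sum(dim=2, keepdim=True)  # per-node degree
        return torch.relu(adj_tilde.bmm(self.linear(h)) / degree)
```

In the paper, pruning with K = 1 works best on TACRED, and the C-GCN variant first contextualizes token vectors with a BiLSTM before stacking GCN layers and pooling over the sentence and the two entity spans.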
Benchmarks
| Benchmark | Model | F1 |
|---|---|---|
| relation-classification-on-tacred-1 | C-GCN | 66.4 |
| relation-extraction-on-re-tacred | C-GCN | 80.3 |
| relation-extraction-on-tacred | C-GCN | 66.4 |
| relation-extraction-on-tacred | C-GCN + PA-LSTM | 68.2 |
| relation-extraction-on-tacred | GCN + PA-LSTM | 67.1 |
| relation-extraction-on-tacred | GCN | 64.0 |