HittER: Hierarchical Transformers for Knowledge Graph Embeddings

Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, Yangfeng Ji

Abstract
This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity's neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity and the top block aggregates the relational information from outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
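The abstract describes a two-block hierarchy: a bottom Transformer that encodes each (entity, relation) pair in the source entity's neighborhood, and a top Transformer that aggregates those pair features. A minimal sketch of that structure is below, assuming standard PyTorch components; the class name `HittERSketch`, the dimensions, the `[CLS]`-style aggregation token, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a HittER-style hierarchy (not the authors' code).
import torch
import torch.nn as nn

class HittERSketch(nn.Module):
    def __init__(self, num_entities, num_relations, d_model=64, nhead=4):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, d_model)
        self.rel_emb = nn.Embedding(num_relations, d_model)
        # Learned [CLS]-style token used by the top block to aggregate
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        # Bottom block: features of each (entity, relation) pair
        self.bottom = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=1)
        # Top block: relational contextualization over the neighborhood
        self.top = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=1)
        self.score = nn.Linear(d_model, num_entities)

    def forward(self, nbr_entities, nbr_relations):
        # nbr_entities, nbr_relations: (batch, num_neighbors) index tensors
        b, k = nbr_entities.shape
        e = self.ent_emb(nbr_entities)   # (b, k, d)
        r = self.rel_emb(nbr_relations)  # (b, k, d)
        # Bottom block sees each pair as a 2-token sequence
        pairs = torch.stack([e, r], dim=2).reshape(b * k, 2, -1)
        pair_feat = self.bottom(pairs)[:, 0].reshape(b, k, -1)
        # Top block aggregates pair features via the [CLS] token
        seq = torch.cat([self.cls.expand(b, -1, -1), pair_feat], dim=1)
        agg = self.top(seq)[:, 0]        # (b, d)
        return self.score(agg)           # logits over candidate entities

model = HittERSketch(num_entities=100, num_relations=10)
logits = model(torch.randint(0, 100, (2, 5)), torch.randint(0, 10, (2, 5)))
print(logits.shape)  # torch.Size([2, 100])
```

The final linear layer scores every entity, matching the link-prediction setup in which the model ranks candidate target entities; the paper's masked entity prediction objective would additionally mask neighborhood entities during training.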
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| link-prediction-on-fb15k-237 | HittER | Hits@1: 0.279, Hits@3: 0.409, Hits@10: 0.558, MRR: 0.373 |
| link-prediction-on-wn18rr | HittER | Hits@1: 0.462, Hits@3: 0.516, Hits@10: 0.584, MRR: 0.503 |