Liang Yao; Chengsheng Mao; Yuan Luo

Abstract
Knowledge graphs are important resources for many artificial intelligence tasks but often suffer from incompleteness. In this work, we propose to use pre-trained language models for knowledge graph completion. We treat triples in knowledge graphs as textual sequences and propose a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples. Our method takes the entity and relation descriptions of a triple as input and computes the scoring function of the triple with the KG-BERT language model. Experimental results on multiple benchmark knowledge graphs show that our method can achieve state-of-the-art performance in triple classification, link prediction and relation prediction tasks.
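
As a rough illustration of the idea of scoring a triple as a textual sequence, the sketch below feeds entity and relation descriptions into a BERT sequence classifier via the Hugging Face transformers library. The function name `score_triple` and the two-segment input packing are simplifications for illustration (the paper separates head, relation, and tail descriptions with [SEP] tokens and fine-tunes the model); this is not the authors' released implementation.

```python
# Minimal sketch of KG-BERT-style triple scoring, assuming the Hugging Face
# `transformers` library. Names and input packing are illustrative only.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Two labels: plausible vs. implausible triple (triple classification).
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
model.eval()


def score_triple(head_desc: str, relation_desc: str, tail_desc: str) -> float:
    """Score a (head, relation, tail) triple from its textual descriptions.

    Here the head description goes into BERT's first segment and the
    relation and tail descriptions are concatenated into the second
    segment; the paper instead uses three [SEP]-separated spans.
    """
    inputs = tokenizer(
        head_desc,
        relation_desc + " " + tail_desc,
        return_tensors="pt",
        truncation=True,
        max_length=128,
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability that the triple is plausible (class 1).
    return torch.softmax(logits, dim=-1)[0, 1].item()


# Example usage with free-text descriptions of the entities and relation.
print(score_triple(
    "Barack Obama, 44th president of the United States",
    "place of birth",
    "Honolulu, the capital of Hawaii",
))
```

Before fine-tuning on labeled positive and corrupted triples, the scores above are essentially uninformative; the sketch only shows how descriptions are packed into a single sequence and scored through the classification head.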
Benchmarks
| Benchmark | Model | Hits@10 | MR |
|---|---|---|---|
| Link prediction on FB15k-237 | KG-BERT | 0.42 | 153 |
| Link prediction on UMLS | KG-BERT | 0.990 | 1.47 |
| Link prediction on WN18RR | KG-BERT | 0.524 | 97 |