KG-BERT: BERT for Knowledge Graph Completion

Liang Yao; Chengsheng Mao; Yuan Luo

Abstract

Knowledge graphs are important resources for many artificial intelligence tasks but often suffer from incompleteness. In this work, we propose to use pre-trained language models for knowledge graph completion. We treat triples in knowledge graphs as textual sequences and propose a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples. Our method takes the entity and relation descriptions of a triple as input and computes the scoring function of the triple with the KG-BERT language model. Experimental results on multiple benchmark knowledge graphs show that our method can achieve state-of-the-art performance in triple classification, link prediction, and relation prediction tasks.
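The abstract's idea of treating a triple as a textual sequence can be sketched as follows: KG-BERT concatenates the head description, relation, and tail description into one BERT-style input, [CLS] h [SEP] r [SEP] t [SEP], where head and tail tokens share one segment id and relation tokens get the other, and then scores the sequence with BERT. This is an illustrative simplification, not the authors' code; in particular, whitespace splitting stands in for BERT's WordPiece tokenizer, and the function name is ours.

```python
def pack_triple(head_desc: str, relation_desc: str, tail_desc: str):
    """Build a KG-BERT-style input sequence for one (h, r, t) triple.

    Returns (tokens, segment_ids). Head and tail tokens share segment 0
    while relation tokens get segment 1; whitespace splitting stands in
    for BERT's WordPiece tokenizer (illustrative sketch only).
    """
    tokens, segments = ["[CLS]"], [0]
    for text, seg in ((head_desc, 0), (relation_desc, 1), (tail_desc, 0)):
        for word in text.lower().split():
            tokens.append(word)
            segments.append(seg)
        tokens.append("[SEP]")  # separator token after each of h, r, t
        segments.append(seg)
    return tokens, segments


tokens, segments = pack_triple("Steve Jobs", "founded", "Apple Inc")
print(tokens)
# ['[CLS]', 'steve', 'jobs', '[SEP]', 'founded', '[SEP]', 'apple', 'inc', '[SEP]']
```

In the full model, the [CLS] position's final hidden state is fed to a classification layer to produce the triple's score.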

Code Repositories

gychant/CSKMTermDefn (PyTorch, mentioned in GitHub)
ManasRMohanty/DS5500-capstone (PyTorch, mentioned in GitHub)
yao8839836/kg-bert (official, PyTorch)

Benchmarks

Benchmark                      Methodology   Hits@10   MR
link-prediction-on-fb15k-237   KG-BERT       0.42      153
link-prediction-on-umls        KG-BERT       0.990     1.47
link-prediction-on-wn18rr      KG-BERT       0.524     97
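The two metrics above are standard for link prediction: Mean Rank (MR) is the average rank assigned to the correct entity (lower is better), and Hits@10 is the fraction of test triples where the correct entity ranks in the top 10 (higher is better). A minimal sketch of how they are computed from per-triple ranks (function names and example ranks are ours, not from the paper):

```python
def mean_rank(ranks):
    """Mean Rank (MR): average rank of the correct entity; lower is better."""
    return sum(ranks) / len(ranks)


def hits_at_k(ranks, k=10):
    """Hits@k: fraction of correct entities ranked within the top k; higher is better."""
    return sum(1 for r in ranks if r <= k) / len(ranks)


ranks = [1, 3, 12, 2]  # hypothetical ranks for four test triples
print(mean_rank(ranks))   # 4.5
print(hits_at_k(ranks))   # 0.75
```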
