KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation
Xiaozhi Wang; Tianyu Gao; Zhaocheng Zhu; Zhengyan Zhang; Zhiyuan Liu; Juanzi Li; Jian Tang

Abstract
Pre-trained language representation models (PLMs) do not capture factual knowledge from text well. In contrast, knowledge embedding (KE) methods can effectively represent the relational facts in knowledge graphs (KGs) with informative entity embeddings, but conventional KE models cannot take full advantage of the abundant textual information. In this paper, we propose a unified model for Knowledge Embedding and Pre-trained LanguagE Representation (KEPLER), which can not only better integrate factual knowledge into PLMs but also produce effective text-enhanced KE with strong PLMs. In KEPLER, we encode textual entity descriptions with a PLM as their embeddings, and then jointly optimize the KE and language modeling objectives. Experimental results show that KEPLER achieves state-of-the-art performance on various NLP tasks, and also works remarkably well as an inductive KE model on KG link prediction. Furthermore, for pre-training and evaluating KEPLER, we construct Wikidata5M, a large-scale KG dataset with aligned entity descriptions, and benchmark state-of-the-art KE methods on it. It shall serve as a new KE benchmark and facilitate research on large KGs, inductive KE, and KGs with text. The source code can be obtained from https://github.com/THU-KEG/KEPLER.
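
As a rough illustration of how the two objectives combine, the sketch below encodes entity descriptions with a shared RoBERTa encoder (via HuggingFace Transformers) and sums a TransE-style negative-sampling loss with a masked-language-modeling loss, following the simplest setup in which relations are stored as learnable vectors. The class name, margin value, and batching conventions are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
# Minimal sketch of KEPLER's joint KE + MLM objective (NOT the authors' code).
# Assumes a RoBERTa backbone from HuggingFace Transformers; margin and batching
# conventions are illustrative.
import torch
import torch.nn.functional as F
from transformers import AutoModelForMaskedLM


class KeplerSketch(torch.nn.Module):
    def __init__(self, num_relations: int, model_name: str = "roberta-base",
                 gamma: float = 4.0):
        super().__init__()
        # One shared encoder serves both the KE and MLM objectives.
        self.mlm = AutoModelForMaskedLM.from_pretrained(model_name)
        self.encoder = self.mlm.roberta  # model-specific attribute for RoBERTa
        hidden = self.encoder.config.hidden_size
        self.rel_emb = torch.nn.Embedding(num_relations, hidden)
        self.gamma = gamma

    def embed_entity(self, input_ids, attention_mask):
        # Entity embedding = encoder output at the <s> (first) token of its description.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]

    def ke_loss(self, h, r_idx, t, t_neg):
        # TransE-style negative-sampling loss: push ||h + r - t|| below the margin
        # for true tails and above it for corrupted tails.
        r = self.rel_emb(r_idx)
        pos_dist = torch.norm(h + r - t, p=1, dim=-1)                     # (B,)
        neg_dist = torch.norm((h + r).unsqueeze(1) - t_neg, p=1, dim=-1)  # (B, N)
        pos_loss = -F.logsigmoid(self.gamma - pos_dist)
        neg_loss = -F.logsigmoid(neg_dist - self.gamma).mean(dim=-1)
        return (pos_loss + neg_loss).mean()

    def mlm_loss(self, input_ids, attention_mask, labels):
        # Standard masked-language-modeling loss on ordinary text.
        return self.mlm(input_ids=input_ids, attention_mask=attention_mask,
                        labels=labels).loss

    def forward(self, head_desc, tail_desc, neg_tail_desc, r_idx, mlm_batch):
        # head_desc / tail_desc: tokenized entity descriptions (dicts of tensors);
        # neg_tail_desc holds batch_size * num_negatives corrupted tails.
        h = self.embed_entity(**head_desc)
        t = self.embed_entity(**tail_desc)
        t_neg = self.embed_entity(**neg_tail_desc).view(h.size(0), -1, h.size(-1))
        return self.ke_loss(h, r_idx, t, t_neg) + self.mlm_loss(**mlm_batch)
```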
Code Repositories
https://github.com/THU-KEG/KEPLER
Benchmarks
| Benchmark | Methodology | MRR | Hits@1 | Hits@3 | Hits@10 | F1 |
|---|---|---|---|---|---|---|
| inductive-knowledge-graph-completion-on-1 | KEPLER-Wiki-rel | 0.402 | 0.222 | 0.514 | 0.730 | - |
| link-prediction-on-wikidata5m | DistMult | 0.253 | 0.208 | 0.278 | 0.334 | - |
| link-prediction-on-wikidata5m | SimplE | 0.296 | 0.252 | 0.317 | 0.377 | - |
| link-prediction-on-wikidata5m | ComplEx | 0.281 | 0.228 | 0.310 | 0.373 | - |
| link-prediction-on-wikidata5m | TransE | 0.253 | 0.170 | 0.311 | 0.392 | - |
| link-prediction-on-wikidata5m | RotatE | 0.290 | 0.234 | 0.322 | 0.390 | - |
| link-prediction-on-wikidata5m | KEPLER-Wiki-rel | 0.210 | 0.173 | 0.224 | 0.277 | - |
| relation-classification-on-tacred-1 | KEPLER | - | - | - | - | 71.7 |
| relation-extraction-on-tacred | KEPLER | - | - | - | - | 71.7 |
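
The link-prediction numbers above follow standard KG-completion evaluation: each test triple is scored against candidate entities, the rank of the correct entity is recorded, and MRR and Hits@k are averaged over those ranks. A minimal sketch of the metric computation (the ranks are illustrative, not from the table):

```python
# Compute MRR and Hits@k from a list of ranks, where ranks[i] is the rank of
# the correct entity for test triple i (lower is better).
def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 3, 12, 2, 250]  # illustrative ranks only
print(mrr(ranks), hits_at_k(ranks, 10))
```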