Convolutional 2D Knowledge Graph Embeddings
Tim Dettmers; Pasquale Minervini; Pontus Stenetorp; Sebastian Riedel

Abstract
Link prediction for knowledge graphs is the task of predicting missing relationships between entities. Previous work on link prediction has focused on shallow, fast models which can scale to large knowledge graphs. However, these models learn less expressive features than deep, multi-layer models -- which potentially limits performance. In this work, we introduce ConvE, a multi-layer convolutional network model for link prediction, and report state-of-the-art results for several established datasets. We also show that the model is highly parameter efficient, yielding the same performance as DistMult and R-GCN with 8x and 17x fewer parameters. Analysis of our model suggests that it is particularly effective at modelling nodes with high indegree -- which are common in highly-connected, complex knowledge graphs such as Freebase and YAGO3. In addition, it has been noted that the WN18 and FB15k datasets suffer from test set leakage, due to inverse relations from the training set being present in the test set -- however, the extent of this issue has so far not been quantified. We find this problem to be severe: a simple rule-based model can achieve state-of-the-art results on both WN18 and FB15k. To ensure that models are evaluated on datasets where simply exploiting inverse relations cannot yield competitive results, we investigate and validate several commonly used datasets -- deriving robust variants where necessary. We then perform experiments on these robust datasets for our own and several previously proposed models and find that ConvE achieves state-of-the-art Mean Reciprocal Rank across most datasets.
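The abstract summarises the core of ConvE: subject-entity and relation embeddings are reshaped into 2D, stacked, passed through a 2D convolution, vectorised, projected back into the embedding space, and matched against every candidate object embedding. The following PyTorch sketch illustrates that scoring pipeline only; the embedding size (200, reshaped to 10x20), filter count (32), and kernel size (3x3) are common choices assumed here for illustration, and regularisation details such as dropout and batch normalisation are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvEScorer(nn.Module):
    """Minimal sketch of a ConvE-style scoring function (simplifying assumptions noted above)."""

    def __init__(self, num_entities, num_relations, emb_dim=200, h=10, w=20):
        super().__init__()
        assert emb_dim == h * w
        self.h, self.w = h, w
        self.entity_emb = nn.Embedding(num_entities, emb_dim)
        self.relation_emb = nn.Embedding(num_relations, emb_dim)
        self.conv = nn.Conv2d(1, 32, kernel_size=3)      # 32 filters of size 3x3, no padding
        conv_out = 32 * (2 * h - 2) * (w - 2)            # spatial size after a valid 3x3 convolution
        self.fc = nn.Linear(conv_out, emb_dim)           # project back to the embedding space

    def forward(self, subject_idx, relation_idx):
        # Reshape subject and relation embeddings to 2D and stack them into one "image"
        e_s = self.entity_emb(subject_idx).view(-1, 1, self.h, self.w)
        r_r = self.relation_emb(relation_idx).view(-1, 1, self.h, self.w)
        x = torch.cat([e_s, r_r], dim=2)                 # (batch, 1, 2h, w)
        x = F.relu(self.conv(x))                         # 2D convolution over the stacked input
        x = F.relu(self.fc(x.view(x.size(0), -1)))       # vectorise and project
        # Score all candidate object entities at once via a dot product (1-N scoring)
        return torch.sigmoid(x @ self.entity_emb.weight.t())
```

Scoring against all entities in a single matrix multiplication is what keeps a model like this fast at evaluation time despite the extra convolutional layers.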
Benchmarks
| Benchmark | Model | MRR | MR | Hits@1 | Hits@3 | Hits@10 |
|---|---|---|---|---|---|---|
| Link Prediction on FB15k | ConvE | 0.657 | 51 | 0.558 | 0.723 | 0.831 |
| Link Prediction on FB15k | Inverse Model | 0.660 | 2501 | 0.658 | 0.659 | 0.660 |
| Link Prediction on FB15k-237 | Inverse Model | 0.010 | 7030 | 0.007 | 0.011 | 0.014 |
| Link Prediction on FB15k-237 | ConvE | 0.325 | | 0.237 | 0.356 | 0.501 |
| Link Prediction on UMLS | ConvE | | 1.51 | | | 0.990 |
| Link Prediction on WN18 | ConvE | 0.943 | 374 | 0.935 | 0.946 | 0.956 |
| Link Prediction on WN18 | Inverse Model | 0.963 | 740 | 0.953 | 0.964 | 0.964 |
| Link Prediction on WN18RR | ConvE | 0.430 | | 0.400 | 0.440 | 0.520 |
| Link Prediction on WN18RR | Inverse Model | 0.35 | 13526 | 0.35 | 0.35 | 0.35 |
| Link Prediction on YAGO3-10 | ConvE | 0.44 | | | | 0.62 |
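The columns above are the standard ranking metrics for link prediction: MR is the mean rank of the correct entity among all candidates, MRR the mean of its reciprocal rank, and Hits@k the fraction of test triples for which the correct entity is ranked within the top k. A minimal sketch of how these are computed, assuming the (filtered) ranks have already been collected:

```python
import numpy as np

def ranking_metrics(ranks, ks=(1, 3, 10)):
    """Compute MR, MRR, and Hits@k from 1-based ranks of the correct entities."""
    ranks = np.asarray(ranks, dtype=float)
    metrics = {
        "MR": ranks.mean(),               # mean rank (lower is better)
        "MRR": (1.0 / ranks).mean(),      # mean reciprocal rank (higher is better)
    }
    for k in ks:
        metrics[f"Hits@{k}"] = (ranks <= k).mean()  # fraction ranked within the top k
    return metrics

# Example with hypothetical ranks for five test triples:
# ranking_metrics([1, 4, 2, 120, 7])
```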