BoxE: A Box Embedding Model for Knowledge Base Completion
Ralph Abboud, İsmail İlkan Ceylan, Thomas Lukasiewicz, Tommaso Salvatori

Abstract
Knowledge base completion (KBC) aims to automatically infer missing facts by exploiting information already present in a knowledge base (KB). A promising approach for KBC is to embed knowledge into latent spaces and make predictions from learned embeddings. However, existing embedding models are subject to at least one of the following limitations: (1) theoretical inexpressivity, (2) lack of support for prominent inference patterns (e.g., hierarchies), (3) lack of support for KBC over higher-arity relations, and (4) lack of support for incorporating logical rules. Here, we propose a spatio-translational embedding model, called BoxE, that simultaneously addresses all these limitations. BoxE embeds entities as points, and relations as a set of hyper-rectangles (or boxes), which spatially characterize basic logical properties. This seemingly simple abstraction yields a fully expressive model offering a natural encoding for many desired logical properties. BoxE can both capture and inject rules from rich classes of rule languages, going well beyond individual inference patterns. By design, BoxE naturally applies to higher-arity KBs. We conduct a detailed experimental analysis, and show that BoxE achieves state-of-the-art performance, both on benchmark knowledge graphs and on more general KBs, and we empirically show the power of integrating logical rules.
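To make the spatio-translational idea concrete, below is a minimal sketch for binary facts r(h, t): each entity has a base point and a translational "bump", each relation has one box per argument position, and a fact is plausible when each entity's translated point falls inside the corresponding box. The embeddings, box bounds, and the L1 point-to-box distance used here are simplified stand-ins for illustration, not the paper's exact scoring function.

```python
# Minimal sketch of the BoxE intuition (simplified, not the paper's exact score).
import numpy as np

DIM = 4
rng = np.random.default_rng(0)

# Hypothetical toy embeddings: a base point and a translational bump per entity.
entities = {
    name: {"base": rng.normal(size=DIM), "bump": rng.normal(size=DIM)}
    for name in ["alice", "bob"]
}

# Hypothetical binary relation: one box (lower/upper corners) per argument position.
relation = {
    "head_box": (np.full(DIM, -2.0), np.full(DIM, 2.0)),
    "tail_box": (np.full(DIM, -2.0), np.full(DIM, 2.0)),
}

def point_to_box_distance(point, box):
    """L1 distance from a point to a box; zero when the point lies inside."""
    lower, upper = box
    return np.sum(np.maximum(0.0, lower - point) + np.maximum(0.0, point - upper))

def score(relation, head, tail):
    """Lower is better: each entity's point is translated by the *other*
    entity's bump, then compared against the box for its argument position."""
    h_final = entities[head]["base"] + entities[tail]["bump"]
    t_final = entities[tail]["base"] + entities[head]["bump"]
    return (point_to_box_distance(h_final, relation["head_box"])
            + point_to_box_distance(t_final, relation["tail_box"]))

print(score(relation, "alice", "bob"))  # 0.0 when both translated points fall inside their boxes
```

For higher-arity facts, the same pattern extends naturally: a relation of arity n carries n boxes, and each entity's point is translated by the bumps of all other entities in the fact.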
Benchmarks
| Benchmark | Model | MRR | Hits@1 | Hits@10 |
|---|---|---|---|---|
| link-prediction-on-fb-auto | BoxE | 0.844 | 0.814 | 0.898 |
| link-prediction-on-fb15k-237 | BoxE | 0.337 | 0.238 | 0.538 |
| link-prediction-on-jf17k | BoxE | 0.560 | 0.472 | 0.722 |
| link-prediction-on-yago3-10 | BoxE | 0.567 | 0.494 | 0.699 |