Holographic Embeddings of Knowledge Graphs
Maximilian Nickel; Lorenzo Rosasco; Tomaso Poggio

Abstract
Learning embeddings of entities and relations is an efficient and versatile method for performing machine learning on relational data such as knowledge graphs. In this work, we propose holographic embeddings (HolE) to learn compositional vector space representations of entire knowledge graphs. The proposed method is related to holographic models of associative memory in that it employs circular correlation to create compositional representations. By using correlation as the compositional operator, HolE can capture rich interactions while remaining efficient to compute, easy to train, and scalable to very large datasets. In extensive experiments, we show that holographic embeddings outperform state-of-the-art methods for link prediction on knowledge graphs and relational learning benchmark datasets.
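The abstract centers on one operation: composing subject and object embeddings with circular correlation and matching the result against a relation embedding. The sketch below is a minimal illustration of that idea, not the authors' released implementation; it computes circular correlation through the standard FFT identity and scores a triple with a dot product followed by a sigmoid. The 16-dimensional random vectors are placeholder embeddings chosen only for the example.

```python
import numpy as np

def circular_correlation(a, b):
    # Circular correlation via the FFT identity:
    # corr(a, b) = ifft(conj(fft(a)) * fft(b)); the result of correlating
    # two real vectors is real, so we drop the negligible imaginary part.
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

def hole_score(e_s, r_p, e_o):
    # Score a triple (s, p, o): compose subject and object embeddings with
    # circular correlation, match against the relation embedding via a dot
    # product, and squash to (0, 1) with a sigmoid.
    return 1.0 / (1.0 + np.exp(-r_p @ circular_correlation(e_s, e_o)))

# Toy usage with random placeholder embeddings (illustrative only).
rng = np.random.default_rng(0)
d = 16
e_s, e_o, r_p = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)
print(hole_score(e_s, r_p, e_o))
```

Computing the correlation with FFTs keeps composition at O(d log d) for d-dimensional embeddings, which is the efficiency the abstract points to relative to operators that materialize a full d × d interaction.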
Benchmarks
| Benchmark | Methodology | MRR | Hits@1 | Hits@3 | Hits@10 |
|---|---|---|---|---|---|
| link-prediction-on-fb15k-1 | HolE | 0.524 | 0.402 | 0.613 | 0.739 |
| link-prediction-on-wn18 | HolE | 0.938 | 0.930 | 0.945 | 0.949 |