MLMLM: Link Prediction with Mean Likelihood Masked Language Model

Louis Clouatre, Philippe Trempe, Amal Zouaq, Sarath Chandar

Abstract

Knowledge Bases (KBs) are easy to query, verifiable, and interpretable. However, they scale only with human effort and curated high-quality data. Masked Language Models (MLMs), such as BERT, scale with computing power and unstructured raw text data, but the knowledge contained in these models is not directly interpretable. We propose to perform link prediction with MLMs to address both the scalability issues of KBs and the interpretability issues of MLMs. To do so, we introduce MLMLM (Mean Likelihood Masked Language Model), an approach that compares the mean likelihood of generating the different candidate entities to perform link prediction in a tractable manner. We obtain state-of-the-art (SotA) results on the WN18RR dataset and the best non-entity-embedding-based results on the FB15k-237 dataset. We also obtain convincing results on link prediction for previously unseen entities, making MLMLM a suitable approach for introducing new entities to a KB.
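The core scoring idea can be illustrated with a short sketch: mask out the slot for the unknown entity, and score each candidate by the mean log-likelihood the MLM assigns to that candidate's tokens at the masked positions. Averaging over tokens, rather than summing, keeps candidates of different tokenized lengths comparable, which is what makes exhaustive ranking tractable. The sketch below is an assumption-laden illustration: it uses an off-the-shelf HuggingFace RoBERTa checkpoint and a hypothetical `[SLOT]` template, whereas the paper's model is fine-tuned for the task and uses its own input formatting.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Off-the-shelf checkpoint for illustration only; MLMLM fine-tunes its MLM.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")
model.eval()

def mean_log_likelihood(template: str, candidate: str) -> float:
    """Mean log-likelihood of the candidate's tokens at the masked slot.

    `template` contains the hypothetical placeholder "[SLOT]", which is
    replaced by one mask token per token of the candidate entity.
    """
    # Leading space so the candidate tokenizes as it would mid-sentence.
    cand_ids = tokenizer(" " + candidate, add_special_tokens=False)["input_ids"]
    masks = "".join([tokenizer.mask_token] * len(cand_ids))
    enc = tokenizer(template.replace("[SLOT]", masks), return_tensors="pt")
    mask_pos = (enc["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**enc).logits[0]           # (seq_len, vocab_size)
    log_probs = torch.log_softmax(logits[mask_pos], dim=-1)
    token_ll = log_probs[torch.arange(len(cand_ids)), torch.tensor(cand_ids)]
    # Mean (not sum) keeps different-length candidates on the same scale.
    return token_ll.mean().item()

# Rank tail candidates for a (head, relation, ?) query.
query = "Bordeaux is located in [SLOT]."
candidates = ["France", "Germany", "Japan"]
ranked = sorted(candidates, key=lambda c: -mean_log_likelihood(query, c))
print(ranked)
```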

Benchmarks

Benchmark                      Methodology  MRR     MR    Hits@1  Hits@3  Hits@10
link-prediction-on-fb15k-237   MLMLM        0.2591  411   0.1871  0.2820  0.4026
link-prediction-on-wn18rr      MLMLM        0.5017  1603  0.4391  0.5418  0.611
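These are the standard ranking metrics for link prediction (typically reported in the filtered setting, where other known true triples are removed before ranking): from the 1-based rank of the true entity in each test query, MR is the mean rank, MRR the mean reciprocal rank, and Hits@k the fraction of queries whose true entity ranks in the top k. A minimal sketch of the computation, assuming the ranks have already been obtained (e.g. via a scoring function like the one above):

```python
from typing import Iterable

def ranking_metrics(ranks: Iterable[int], ks=(1, 3, 10)) -> dict:
    """Compute MR, MRR, and Hits@k from 1-based ranks of the true entity."""
    ranks = list(ranks)
    n = len(ranks)
    out = {
        "MR": sum(ranks) / n,                    # mean rank (lower is better)
        "MRR": sum(1.0 / r for r in ranks) / n,  # mean reciprocal rank (higher is better)
    }
    for k in ks:
        out[f"Hits@{k}"] = sum(r <= k for r in ranks) / n
    return out

# Toy example: ranks of the true entity for five test queries.
print(ranking_metrics([1, 2, 3, 10, 50]))
```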
