MoCoSA: Momentum Contrast for Knowledge Graph Completion with Structure-Augmented Pre-trained Language Models
Jiabang He; Liu Jia; Lei Wang; Xiyao Li; Xing Xu

Abstract
Knowledge Graph Completion (KGC) aims to reason over the facts within knowledge graphs and automatically infer missing links. Existing methods fall mainly into two categories: structure-based and description-based. On the one hand, structure-based methods effectively represent relational facts in knowledge graphs using entity embeddings. However, they struggle with semantically rich real-world entities due to limited structural information, and they fail to generalize to unseen entities. On the other hand, description-based methods leverage pre-trained language models (PLMs) to understand textual information and exhibit strong robustness to unseen entities. However, they have difficulty scaling to larger numbers of negative samples and often lag behind structure-based methods. To address these issues, we propose Momentum Contrast for knowledge graph completion with Structure-Augmented pre-trained language models (MoCoSA), which allows the PLM to perceive structural information through an adaptable structure encoder. To improve learning efficiency, we propose momentum hard negative sampling and intra-relation negative sampling. Experimental results demonstrate that our approach achieves state-of-the-art performance in terms of mean reciprocal rank (MRR), with improvements of 2.5% on WN18RR and 21% on OpenBG500.
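The abstract does not spell out MoCoSA's training objective, but momentum-contrast methods typically optimize an InfoNCE loss that scores one positive candidate against a queue of negatives. The sketch below is a generic, hypothetical illustration of such a loss over plain Python embedding vectors; it is not the paper's actual implementation, and the temperature value and embedding names are assumptions.

```python
import math

def dot(a, b):
    # Inner product of two embedding vectors.
    return sum(x * y for x, y in zip(a, b))

def info_nce(query, positive, negatives, temperature=0.05):
    # Generic InfoNCE loss (assumed form, not MoCoSA's exact objective):
    # the positive candidate sits at index 0, negatives follow.
    logits = [dot(query, positive) / temperature]
    logits += [dot(query, n) / temperature for n in negatives]
    # Numerically stable log-sum-exp for the softmax denominator.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(l - m) for l in logits))
    # Cross-entropy with the positive as the target class.
    return log_sum - logits[0]

# Toy check: a well-aligned positive should yield a lower loss
# than a misaligned one against the same negative queue.
q = [1.0, 0.0]
good = info_nce(q, [1.0, 0.0], [[0.0, 1.0]])
bad = info_nce(q, [0.0, 1.0], [[1.0, 0.0]])
```

In MoCo-style training, the negative queue is filled with embeddings produced by a slowly updated momentum encoder, which keeps the negatives consistent across steps without re-encoding a huge batch.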
Benchmarks
| Benchmark | Methodology | MRR | Hits@1 | Hits@3 | Hits@10 |
|---|---|---|---|---|---|
| link-prediction-on-fb15k-237 | MoCoSA | 0.387 | 0.292 | 0.42 | 0.578 |
| link-prediction-on-openbg500 | MoCoSA | 0.634 | 0.531 | 0.711 | 0.83 |
| link-prediction-on-wn18rr | MoCoSA | 0.696 | 0.624 | 0.737 | 0.82 |
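The metrics above are standard link-prediction scores computed from the rank of the correct entity among all candidates. A minimal sketch of how MRR and Hits@k are derived from a list of ranks (the example ranks are illustrative, not from the paper):

```python
def mrr(ranks):
    # Mean reciprocal rank: average of 1/rank over all test triples.
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    # Fraction of test triples whose correct entity is ranked in the top k.
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Hypothetical ranks of the gold entity for five test triples.
ranks = [1, 3, 2, 11, 1]
print(round(mrr(ranks), 3))      # → 0.585
print(hits_at_k(ranks, 10))      # → 0.8
```

Higher is better for both metrics; MRR rewards placing the correct entity near the top, while Hits@k only checks whether it falls within the first k candidates.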