Relphormer: Relational Graph Transformer for Knowledge Graph Representations

Zhen Bi, Siyuan Cheng, Jing Chen, Xiaozhuan Liang, Feiyu Xiong, Ningyu Zhang

Abstract

Transformers have achieved remarkable performance in widespread fields, including natural language processing, computer vision, and graph mining. However, vanilla Transformer architectures have not yielded promising improvements for Knowledge Graph (KG) representation, an area still dominated by the translational distance paradigm. This is because vanilla Transformer architectures struggle to capture the intrinsically heterogeneous structural and semantic information of knowledge graphs. To this end, we propose a new variant of Transformer for knowledge graph representations, dubbed Relphormer. Specifically, we introduce Triple2Seq, which dynamically samples contextualized sub-graph sequences as input to alleviate the heterogeneity issue. We propose a novel structure-enhanced self-attention mechanism to encode the relational information while preserving the semantic information within entities and relations. Moreover, we utilize masked knowledge modeling for general knowledge graph representation learning, which can be applied to various KG-based tasks including knowledge graph completion, question answering, and recommendation. Experimental results on six datasets show that Relphormer outperforms baseline methods. Code is available at https://github.com/zjunlp/Relphormer.
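The abstract only sketches the structure-enhanced self-attention mechanism. As an illustration, below is a minimal PyTorch sketch of one plausible reading: standard multi-head self-attention over the Triple2Seq sub-graph sequence, with an additive bias on the attention scores derived from the sub-graph's structure. The class name, the bias tensor's shape, and how the bias is built are assumptions made here for illustration only; the actual formulation is in the paper and the zjunlp/Relphormer repository.

```python
# Minimal sketch (not the official implementation) of self-attention with an
# additive structural bias, in the spirit of Relphormer's structure-enhanced
# self-attention. `structure_bias` is assumed to be a per-pair score derived
# from the sampled sub-graph (e.g. its adjacency); see the paper/repo for the
# exact construction.
import math
import torch
import torch.nn as nn


class StructureBiasedSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, structure_bias: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) embeddings of the contextualized sub-graph
        #    sequence produced by Triple2Seq.
        # structure_bias: (batch, seq_len, seq_len) additive attention bias
        #    encoding structural/relational proximity between sequence nodes.
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)

        # Scaled dot-product scores plus the structural bias (broadcast over heads).
        scores = (q @ k.transpose(-2, -1)) / math.sqrt(self.head_dim)
        scores = scores + structure_bias.unsqueeze(1)
        attn = scores.softmax(dim=-1)

        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)


if __name__ == "__main__":
    layer = StructureBiasedSelfAttention(dim=128)
    x = torch.randn(2, 16, 128)    # 2 sampled sub-graph sequences of 16 nodes
    bias = torch.zeros(2, 16, 16)  # placeholder bias, e.g. from sub-graph adjacency
    print(layer(x, bias).shape)    # torch.Size([2, 16, 128])
```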

Code Repositories

zjunlp/relphormer (Official, PyTorch): https://github.com/zjunlp/Relphormer

Benchmarks

| Benchmark                     | Methodology | Hits@1 | Hits@10 | MRR   |
|-------------------------------|-------------|--------|---------|-------|
| Link Prediction on FB15k-237  | Relphormer  | 0.314  | 0.481   | 0.371 |
| Link Prediction on WN18RR     | Relphormer  | 0.448  | 0.591   | 0.495 |
