Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion

Bo Wang, Tao Shen, Guodong Long, Tianyi Zhou, Yi Chang

Abstract

Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks, but these graphs are usually incomplete, motivating their automatic completion. Prevalent graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings and capturing their triple-level relationships with spatial distance. However, they generalize poorly to elements never visited in training and are intrinsically vulnerable to graph incompleteness. In contrast, textual encoding approaches, e.g., KG-BERT, resort to the text of graph triples and triple-level contextualized representations. They generalize well and are robust to incompleteness, especially when coupled with pre-trained encoders, but two major drawbacks limit their performance: (1) high overheads due to the costly scoring of all possible triples during inference, and (2) a lack of structured knowledge in the textual encoder. In this paper, we follow the textual encoding paradigm and aim to alleviate its drawbacks by augmenting it with graph embedding techniques, yielding a complementary hybrid of both paradigms. Specifically, we partition each triple into two asymmetric parts, as in translation-based graph embedding approaches, and encode both parts into contextualized representations with a Siamese-style textual encoder. Built upon these representations, our model employs a deterministic classifier for representation learning and a spatial measurement for structure learning. Moreover, we develop a self-adaptive ensemble scheme that further improves performance by incorporating triple scores from an existing graph embedding model. In experiments, we achieve state-of-the-art performance on three benchmarks and a zero-shot dataset for link prediction, with inference costs reduced by 1-2 orders of magnitude compared to a purely textual encoding method.
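To make the scoring scheme concrete, below is a minimal PyTorch sketch of the Siamese-style encoding and dual scoring described in the abstract. It is an illustrative simplification, not the official implementation (see wangbo9719/StAR_KGC for that): the encoder name, the classifier shape, and the feature interaction [u; v; u - v; u * v] are assumptions.

```python
# Minimal sketch of StAR-style dual scoring (assumed details; the official
# code is at wangbo9719/StAR_KGC). A shared (Siamese) pre-trained encoder
# embeds the two asymmetric parts of a triple: (head text + relation text)
# and (tail text).
import torch
import torch.nn as nn
from transformers import AutoModel

class StarScorer(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased", hidden: int = 768):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)  # shared weights
        # Deterministic classifier over an assumed interaction of the two parts.
        self.classifier = nn.Sequential(
            nn.Linear(hidden * 4, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # triple is plausible / implausible
        )

    def encode(self, batch):
        # Use the [CLS] token as the contextualized part representation.
        return self.encoder(**batch).last_hidden_state[:, 0]

    def forward(self, head_rel_batch, tail_batch):
        u = self.encode(head_rel_batch)  # embedding of "head + relation" text
        v = self.encode(tail_batch)      # embedding of "tail" text
        # (1) Classification score drives representation learning.
        logits = self.classifier(torch.cat([u, v, u - v, u * v], dim=-1))
        # (2) Spatial (translation-style) distance drives structure learning.
        distance = torch.norm(u - v, p=2, dim=-1)
        return logits, distance
```

Because each entity text and each (head, relation) text is encoded independently, part embeddings can be precomputed and cached at inference time, which is the source of the 1-2 orders-of-magnitude cost reduction over triple-by-triple scorers such as KG-BERT.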

Code Repositories

wangbo9719/StAR_KGC (official implementation, PyTorch)

Benchmarks

Benchmark                     Methodology      Hits@1  Hits@3  Hits@10  MR    MRR
link-prediction-on-fb15k-237  StAR             0.266   0.404   0.562    117   0.365
link-prediction-on-umls       StAR             -       -       0.991    1.49  -
link-prediction-on-wn18rr     StAR (Self-Adp)  0.243   0.491   0.709    51    0.401
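The WN18RR entry above reports the self-adaptive ensemble variant (Self-Adp) mentioned in the abstract. As a rough illustration only, a score-level ensemble between the textual model and a pre-trained graph embedding model could look like the sketch below; the fixed blending weight alpha and the min-max normalization are assumptions, not the paper's exact self-adaptive scheme.

```python
import torch

def ensemble_scores(star_scores: torch.Tensor,
                    kge_scores: torch.Tensor,
                    alpha: float = 0.5) -> torch.Tensor:
    """Blend textual-encoder scores with graph-embedding scores.

    Hypothetical sketch: the paper's self-adaptive scheme chooses the
    blend per query; here alpha is a fixed scalar for simplicity.
    """
    # Min-max normalize each score vector so the two models are comparable.
    def norm(x: torch.Tensor) -> torch.Tensor:
        return (x - x.min()) / (x.max() - x.min() + 1e-8)

    return alpha * norm(star_scores) + (1.0 - alpha) * norm(kge_scores)
```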
