GRPE: Relative Positional Encoding for Graph Transformer

Wonpyo Park Woonggi Chang Donggeon Lee Juntae Kim Seung-won Hwang


Abstract

We propose a novel positional encoding for learning graphs with the Transformer architecture. Existing approaches either linearize a graph to encode absolute positions in the resulting sequence of nodes, or encode relative positions with respect to other nodes using bias terms. The former loses the precision of relative positions through linearization, while the latter loses tight integration of node-edge and node-topology interactions. To overcome the weaknesses of previous approaches, our method encodes a graph without linearization and considers both node-topology and node-edge interactions. We name our method Graph Relative Positional Encoding (GRPE), dedicated to graph representation learning. Experiments conducted on various graph datasets show that the proposed method significantly outperforms previous approaches. Our code is publicly available at https://github.com/lenscloth/GRPE.
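The idea of injecting both node-topology and node-edge interactions into attention can be illustrated with a simplified single-head sketch. This is not the paper's exact formulation (GRPE also includes key-side and value-side encoding terms); the function and variable names here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def grpe_style_attention(Q, K, V, spd, E_topo, edge_type, E_edge):
    """One attention head with relative positional terms (illustrative sketch).

    Q, K, V    : (n, d) node queries / keys / values
    spd        : (n, n) integer shortest-path distances between node pairs
                 (assumed clipped to len(E_topo) - 1)
    E_topo     : (n_dist, d) learnable topology encodings, one per distance
    edge_type  : (n, n) integer edge-type ids (0 = no edge)
    E_edge     : (n_types, d) learnable edge encodings, one per edge type
    """
    d = Q.shape[-1]
    # content-content term, as in standard attention
    scores = Q @ K.T
    # query interacts with the topology encoding of each node pair
    scores += np.einsum('id,ijd->ij', Q, E_topo[spd])
    # query interacts with the edge encoding of each node pair
    scores += np.einsum('id,ijd->ij', Q, E_edge[edge_type])
    A = softmax(scores / np.sqrt(d))
    return A @ V
```

Because the positional terms interact with the query vectors rather than entering as fixed scalar biases, the model can weight topology and edge information differently per node, which is the "tight integration" the abstract contrasts with bias-only relative encodings.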

Code Repositories

lenscloth/grpe (official, PyTorch)

Benchmarks

Benchmark: graph-regression-on-pcqm4mv2-lsc
Methodology: GRPE-Large
Test MAE: 0.0876
Validation MAE: 0.0867

