Diffusing Graph Attention

Daniel Glickman Eran Yahav

Abstract

The dominant paradigm for machine learning on graphs uses Message Passing Graph Neural Networks (MP-GNNs), in which node representations are updated by aggregating information from their local neighborhood. Recently, there has been a growing number of attempts to adapt the Transformer architecture to graphs in an effort to solve some known limitations of MP-GNNs. A challenging aspect of designing Graph Transformers is integrating the arbitrary graph structure into the architecture. We propose Graph Diffuser (GD) to address this challenge. GD learns to extract structural and positional relationships between distant nodes in the graph, which it then uses to direct the Transformer's attention and node representations. We demonstrate that existing GNNs and Graph Transformers struggle to capture long-range interactions, and show how Graph Diffuser does so while admitting intuitive visualizations. Experiments on eight benchmarks show Graph Diffuser to be a highly competitive model, outperforming the state of the art in a diverse set of domains.
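The abstract sketches the mechanism at a high level: a diffusion over the graph produces pairwise structural features that bias the Transformer's attention. The paper's exact formulation is not reproduced on this page, so the following is only a minimal illustrative sketch, assuming the bias comes from a learned combination of random-walk (adjacency-power) transition matrices; the class name DiffusionBiasedAttention and parameters such as num_hops and hop_weights are hypothetical, not the authors' API.

```python
# Illustrative sketch only -- not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiffusionBiasedAttention(nn.Module):
    """Single-head self-attention whose logits are shifted by a learned
    combination of random-walk powers of the adjacency matrix, so that
    structurally related but distant nodes can attend to each other."""

    def __init__(self, dim: int, num_hops: int = 8):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # One learned scalar weight per diffusion step (hop).
        self.hop_weights = nn.Parameter(torch.zeros(num_hops))
        self.num_hops = num_hops
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) node features; adj: (n, n) adjacency matrix.
        # Row-normalize so repeated multiplication acts like a random walk.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        walk = adj / deg
        # bias[i, j] accumulates weighted i->j reachability over num_hops steps.
        bias = torch.zeros_like(adj)
        step = torch.eye(adj.size(0), device=adj.device)
        for h in range(self.num_hops):
            step = step @ walk  # (h+1)-hop transition probabilities
            bias = bias + self.hop_weights[h] * step
        logits = (self.q(x) @ self.k(x).T) * self.scale + bias
        attn = F.softmax(logits, dim=-1)
        return attn @ self.v(x)

# Toy usage on a 4-node path graph: every node attends over all others,
# with the diffusion bias encoding how far apart they are.
x = torch.randn(4, 16)
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
out = DiffusionBiasedAttention(dim=16)(x, adj)  # (4, 16)
```

Because the hop weights are learned, such a layer can distinguish 1-hop neighbors from, say, 5-hop neighbors, which plain attention over node features cannot do on its own; this is one concrete way diffusion information can direct attention as described above.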

Benchmarks

| Benchmark | Methodology | Metrics |
| --- | --- | --- |
| graph-classification-on-peptides-func | Graph Diffuser | AP: 0.6651±0.0010 |
| graph-regression-on-peptides-struct | Graph Diffuser | MAE: 0.2461±0.0010 |
| link-prediction-on-pcqm-contact | Graph Diffuser | Hits@1: 0.1369±0.0012; Hits@3: 0.4053±0.0011; Hits@10: 0.8592±0.0007; MRR: 0.3388±0.0011 |
