Graph Transformers without Positional Encodings

Ayush Garg

Abstract

Recently, Transformers for graph representation learning have become increasingly popular, achieving state-of-the-art performance on a wide variety of graph datasets, either alone or in combination with message-passing graph neural networks (MP-GNNs). Infusing graph inductive biases into the innately structure-agnostic Transformer architecture in the form of structural or positional encodings (PEs) is key to achieving these impressive results. However, designing such encodings is tricky, and disparate attempts have been made to engineer them, including Laplacian eigenvectors, relative random-walk probabilities (RRWP), spatial encodings, centrality encodings, and edge encodings. In this work, we argue that such encodings may not be required at all, provided the attention mechanism itself incorporates information about the graph structure. We introduce Eigenformer, a Graph Transformer employing a novel spectrum-aware attention mechanism cognizant of the Laplacian spectrum of the graph, and empirically show that it achieves performance competitive with SOTA Graph Transformers on a number of standard GNN benchmarks. Additionally, we theoretically prove that Eigenformer can express various graph structural connectivity matrices, which is particularly essential when learning over smaller graphs.
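
The abstract does not specify Eigenformer's exact attention formulation, but the general idea of making attention cognizant of the Laplacian spectrum can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration: the class name `SpectrumAwareAttention`, the learned per-eigenvalue weighting `phi`, and the additive spectral bias are hypothetical choices, not the paper's method.

```python
# Hypothetical sketch of spectrum-aware attention. The spectral bias
# B = U diag(phi(lambda)) U^T is an illustrative choice, not Eigenformer's
# published mechanism.
import torch
import torch.nn as nn

class SpectrumAwareAttention(nn.Module):
    """Single-head attention whose logits are biased by a learned
    function of the graph Laplacian spectrum (illustrative only)."""

    def __init__(self, dim: int, hidden: int = 16):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # phi: maps each eigenvalue to a scalar weight (hypothetical choice)
        self.phi = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) node features; adj: (n, n) dense adjacency matrix
        deg = adj.sum(-1).clamp(min=1)
        d_inv_sqrt = deg.pow(-0.5)
        # symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
        lap = torch.eye(adj.size(0), device=adj.device) \
            - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
        evals, evecs = torch.linalg.eigh(lap)           # Laplacian spectrum
        w = self.phi(evals.unsqueeze(-1)).squeeze(-1)   # weight per eigenvalue
        # spectral bias added to the attention logits, so graph structure
        # enters attention directly instead of via positional encodings
        bias = evecs @ torch.diag(w) @ evecs.T
        scores = (self.q(x) @ self.k(x).T) * self.scale + bias
        return torch.softmax(scores, dim=-1) @ self.v(x)
```

In this sketch, graph structure flows into the attention scores through the bias matrix rather than through node-level positional features, which mirrors the abstract's claim that explicit PEs become unnecessary when attention itself is structure-aware.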

Benchmarks

Benchmark | Methodology | Metrics
graph-classification-on-cifar10-100k | EIGENFORMER | Accuracy (%): 70.194
graph-classification-on-mnist | EIGENFORMER | Accuracy: 98.362
graph-classification-on-peptides-func | EIGENFORMER | AP: 0.6414
graph-regression-on-peptides-struct | EIGENFORMER | MAE: 0.2599
graph-regression-on-zinc | EIGENFORMER | MAE: 0.077
node-classification-on-cluster | EIGENFORMER | Accuracy: 77.456
node-classification-on-pattern | EIGENFORMER | Accuracy: 86.738
