
Structure-Aware Transformer for Graph Representation Learning

Dexiong Chen, Leslie O'Bray, Karsten Borgwardt

Abstract

The Transformer architecture has gained growing attention in graph representation learning recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by avoiding their strict structural inductive biases and instead only encoding the graph structure via positional encoding. Here, we show that the node representations generated by the Transformer with positional encoding do not necessarily capture structural similarity between them. To address this issue, we propose the Structure-Aware Transformer, a class of simple and flexible graph Transformers built upon a new self-attention mechanism. This new self-attention incorporates structural information into the original self-attention by extracting a subgraph representation rooted at each node before computing the attention. We propose several methods for automatically generating the subgraph representation and show theoretically that the resulting representations are at least as expressive as the subgraph representations. Empirically, our method achieves state-of-the-art performance on five graph prediction benchmarks. Our structure-aware framework can leverage any existing GNN to extract the subgraph representation, and we show that it systematically improves performance relative to the base GNN model, successfully combining the advantages of GNNs and Transformers. Our code is available at https://github.com/BorgwardtLab/SAT.

Code Repositories

BorgwardtLab/SAT (official implementation, PyTorch)
borgwardtlab/pst (PyTorch)

Benchmarks

Benchmark: emotion-recognition-in-conversation-on
Methodology: SAMGN
Metrics: Weighted-F1: 71.11

Benchmark: graph-property-prediction-on-ogbg-code2
Methodology: SAT
Metrics: Ext. data: No | Number of params: 15,734,000 | Test F1 score: 0.1937 ± 0.0028 | Validation F1 score: 0.1773 ± 0.0023
