Transformers Meet Directed Graphs

Simon Geisler, Yujia Li, Daniel Mankowitz, Ali Taylan Cemgil, Stephan Günnemann, Cosmin Paduraru

Abstract

Transformers were originally proposed as a sequence-to-sequence model for text but have become vital for a wide range of modalities, including images, audio, video, and undirected graphs. However, transformers for directed graphs are a surprisingly underexplored topic, despite their applicability to ubiquitous domains including source code and logic circuits. In this work, we propose two direction- and structure-aware positional encodings for directed graphs: (1) the eigenvectors of the Magnetic Laplacian, a direction-aware generalization of the combinatorial Laplacian; (2) directional random walk encodings. Empirically, we show that the extra directionality information is useful in various downstream tasks, including correctness testing of sorting networks and source code understanding. Together with a data-flow-centric graph construction, our model outperforms the prior state of the art on the Open Graph Benchmark Code2 by a relative 14.7%.
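Both encodings are straightforward to prototype. The sketch below (not the authors' code) illustrates them in Python with numpy, assuming the standard Magnetic Laplacian construction, i.e. the symmetrized degree matrix minus a phase-modulated Hermitian adjacency, and a simplified directed random-walk encoding based on return probabilities. The potential q, the number of encodings k, and the toy graph are illustrative choices, not values from the paper.

import numpy as np

def magnetic_laplacian_pe(A, q=0.25, k=4):
    """Positional encodings from eigenvectors of the Magnetic Laplacian.

    A: dense (n, n) adjacency matrix of a directed graph (A[u, v] = 1
    for an edge u -> v). The potential q controls how strongly edge
    direction is encoded; q = 0 recovers the combinatorial Laplacian
    of the symmetrized graph.
    """
    A_s = 0.5 * (A + A.T)                 # symmetrized adjacency
    D_s = np.diag(A_s.sum(axis=1))        # symmetrized degree matrix
    theta = 2.0 * np.pi * q * (A - A.T)   # antisymmetric phase matrix
    H = A_s * np.exp(1j * theta)          # Hermitian adjacency
    L = D_s - H                           # Magnetic Laplacian (Hermitian)
    eigvals, eigvecs = np.linalg.eigh(L)  # real eigenvalues, ascending
    return eigvecs[:, :k]                 # k eigenvectors, smallest eigenvalues

def directed_rw_pe(A, k=4):
    """Simplified directional random-walk encoding: feature i of node v
    is the probability that an i-step walk on the *directed* graph
    returns to v (diagonal of the i-th transition-matrix power).
    """
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)
    P = A / deg                           # row-stochastic transition matrix
    feats, Pk = [], np.eye(A.shape[0])
    for _ in range(k):
        Pk = Pk @ P
        feats.append(np.diag(Pk))
    return np.stack(feats, axis=1)        # shape (n, k)

# Toy directed 4-cycle: 0 -> 1 -> 2 -> 3 -> 0.
A = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    A[u, v] = 1.0
print(magnetic_laplacian_pe(A).shape)     # (4, 4) complex encodings
print(directed_rw_pe(A))                  # return probabilities, steps 1..4

On the directed 4-cycle, the return probability stays zero until the walk length matches the cycle length, whereas on the undirected version a 2-step walk already returns with probability 0.5; this is the kind of directionality signal that undirected encodings discard.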

Benchmarks

Benchmark: graph-property-prediction-on-ogbg-code2
Methodology: SAT++ with Magnetic Laplacian
External data: No
Number of params: 14,378,069
Test F1 score: 0.2222 ± 0.0010
Validation F1 score: 0.2044 ± 0.0020
