Do We Need Anisotropic Graph Neural Networks?

Shyam A. Tailor Felix L. Opolka Pietro Liò Nicholas D. Lane


Abstract

Common wisdom in the graph neural network (GNN) community dictates that anisotropic models -- in which messages sent between nodes are a function of both the source and target node -- are required to achieve state-of-the-art performance. Benchmarks to date have demonstrated that these models perform better than comparable isotropic models, where messages are a function of the source node only. In this work we provide empirical evidence challenging this narrative: we propose an isotropic GNN, which we call Efficient Graph Convolution (EGC), that consistently outperforms comparable anisotropic models, including the popular GAT and PNA architectures, by using spatially-varying adaptive filters. In addition to raising important questions for the GNN community, our work has significant real-world implications for efficiency. EGC achieves higher model accuracy with lower memory consumption and latency, has characteristics well suited to accelerator implementation, and serves as a drop-in replacement for existing architectures. As an isotropic model, it requires memory proportional to the number of vertices in the graph ($\mathcal{O}(V)$); in contrast, anisotropic models require memory proportional to the number of edges ($\mathcal{O}(E)$). We demonstrate that EGC outperforms existing approaches across 6 large and diverse benchmark datasets, and conclude by discussing questions that our work raises for the community going forward. Code and pretrained models for our experiments are provided at https://github.com/shyam196/egc.
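The $\mathcal{O}(V)$ versus $\mathcal{O}(E)$ memory distinction described in the abstract can be sketched in plain Python. This is a minimal illustration of the two aggregation patterns, not the actual EGC architecture: the message functions below (a per-node scaling, an elementwise edge product) are hypothetical stand-ins for learned transforms.

```python
def isotropic_agg(x, edges):
    """Isotropic aggregation: each message depends only on the source node.

    Messages can be computed once per node, so the message buffer scales
    with the number of vertices, O(V).
    """
    msg = [[2.0 * v for v in feat] for feat in x]  # stand-in node transform
    out = [[0.0] * len(x[0]) for _ in x]
    for src, dst in edges:
        out[dst] = [a + b for a, b in zip(out[dst], msg[src])]
    return out


def anisotropic_agg(x, edges):
    """Anisotropic aggregation: each message depends on source AND target.

    A distinct message must be materialised for every edge, so the message
    buffer scales with the number of edges, O(E).
    """
    # Stand-in edge-wise function: elementwise product of the two endpoints.
    per_edge = [[a * b for a, b in zip(x[src], x[dst])] for src, dst in edges]
    out = [[0.0] * len(x[0]) for _ in x]
    for (src, dst), m in zip(edges, per_edge):
        out[dst] = [a + b for a, b in zip(out[dst], m)]
    return out
```

On dense graphs, where $E$ grows far faster than $V$, the per-edge buffer in the anisotropic variant dominates memory use; the isotropic variant avoids it entirely, which is the efficiency argument the paper builds on.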

Code Repositories

pyg-team/pytorch_geometric (official, PyTorch, mentioned in GitHub)
shyam196/egc (official, PyTorch, mentioned in GitHub)

Benchmarks

Graph property prediction on ogbg-code2 (no edge features, no external data):

| Method   | Params     | Test F1         | Validation F1   |
|----------|------------|-----------------|-----------------|
| EGC-M    | 10,986,002 | 0.1595 ± 0.0019 | 0.1464 ± 0.0021 |
| PNA      | 10,992,050 | 0.1570 ± 0.0032 | 0.1453 ± 0.0025 |
| MPNN-Max | 10,971,506 | 0.1552 ± 0.0022 | 0.1441 ± 0.0016 |
| EGC-S    | 11,156,530 | 0.1528 ± 0.0025 | 0.1427 ± 0.0020 |

Graph property prediction on ogbg-molhiv (no edge features, no external data):

| Method | Params  | Test ROC-AUC    | Validation ROC-AUC |
|--------|---------|-----------------|--------------------|
| EGC-M  | 317,265 | 0.7818 ± 0.0153 | 0.8396 ± 0.0097    |
| EGC-S  | 317,013 | 0.7721 ± 0.0110 | 0.8366 ± 0.0074    |
