Gradient Gating for Deep Multi-Rate Learning on Graphs

T. Konstantin Rusch, Benjamin P. Chamberlain, Michael W. Mahoney, Michael M. Bronstein, Siddhartha Mishra

Abstract

We present Gradient Gating (G$^2$), a novel framework for improving the performance of Graph Neural Networks (GNNs). Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph. Local gradients are harnessed to further modulate message passing updates. Our framework flexibly allows one to use any basic GNN layer as a wrapper around which the multi-rate gradient gating mechanism is built. We rigorously prove that G$^2$ alleviates the oversmoothing problem and allows the design of deep GNNs. Empirical results are presented to demonstrate that the proposed framework achieves state-of-the-art performance on a variety of graph learning tasks, including on large-scale heterophilic graphs.
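To make the mechanism concrete, below is a minimal PyTorch sketch of a single G$^2$ step, based only on the update described in the abstract: an auxiliary GNN layer produces gating values $\hat{\tau} = \sigma(\hat{F}(X))$, a local graph-gradient magnitude $\tau_i = \tanh\big(\sum_{j \in \mathcal{N}(i)} |\hat{\tau}_j - \hat{\tau}_i|^p\big)$ is computed per node and channel, and the wrapped GNN layer's output is blended with the previous features as $X \leftarrow (1 - \tau) \odot X + \tau \odot \sigma(F(X))$. The GraphSAGE-style layers, the class name `G2Layer`, and the dense-adjacency handling are illustrative assumptions, not the authors' code; see tk-rusch/gradientgating for the official implementation.

```python
import torch
import torch.nn as nn


class G2Layer(nn.Module):
    """One Gradient Gating (G^2) step wrapped around a simple
    GraphSAGE-style message-passing layer (dense-adjacency sketch)."""

    def __init__(self, hidden_dim, p=2.0):
        super().__init__()
        self.p = p
        # main message-passing layer F and auxiliary gating layer F_hat;
        # any basic GNN layer could be substituted for either
        self.lin_self = nn.Linear(hidden_dim, hidden_dim)
        self.lin_neigh = nn.Linear(hidden_dim, hidden_dim)
        self.gate_self = nn.Linear(hidden_dim, hidden_dim)
        self.gate_neigh = nn.Linear(hidden_dim, hidden_dim)

    def _mp(self, X, A_norm, lin_self, lin_neigh):
        # SAGE-style update: self term plus mean-aggregated neighbours
        return lin_self(X) + lin_neigh(A_norm @ X)

    def forward(self, X, A):
        # X: (N, d) node features; A: (N, N) binary adjacency matrix
        deg = A.sum(dim=1, keepdim=True).clamp(min=1.0)
        A_norm = A / deg  # row-normalised adjacency for mean aggregation

        # tau_hat: output of the auxiliary gating GNN layer
        tau_hat = torch.sigmoid(self._mp(X, A_norm, self.gate_self, self.gate_neigh))

        # per-node, per-channel graph-gradient magnitude, squashed by tanh:
        # tau_ik = tanh( sum_{j in N(i)} |tau_hat_jk - tau_hat_ik|^p )
        diff = (tau_hat.unsqueeze(0) - tau_hat.unsqueeze(1)).abs() ** self.p  # (N, N, d)
        tau = torch.tanh((A.unsqueeze(-1) * diff).sum(dim=1))  # (N, d)

        # multi-rate gated update: nodes in locally smooth neighbourhoods
        # (small graph gradients) receive correspondingly small updates
        F_out = torch.relu(self._mp(X, A_norm, self.lin_self, self.lin_neigh))
        return (1.0 - tau) * X + tau * F_out


if __name__ == "__main__":
    # tiny usage example on a random undirected graph (hypothetical sizes)
    layer = G2Layer(hidden_dim=16)
    X = torch.randn(5, 16)
    A = (torch.rand(5, 5) < 0.4).float()
    A = ((A + A.T) > 0).float().fill_diagonal_(0)  # symmetric, no self-loops
    print(layer(X, A).shape)  # torch.Size([5, 16])
```

Because $\tau$ vanishes where neighbouring gating values agree, repeated application of this update leaves already-smooth channels nearly unchanged, which is the intuition behind the oversmoothing result stated in the abstract.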

Code Repositories

tk-rusch/gradientgating (official, PyTorch)

Benchmarks

Benchmark                          | Methodology   | Metrics
node-classification-on-arxiv-year  | G^2-GraphSAGE | Accuracy: 63.30±1.84
node-classification-on-genius      | G^2-GraphSAGE | Accuracy: 90.85±0.64
