DRew: Dynamically Rewired Message Passing with Delay
Benjamin Gutteridge, Xiaowen Dong, Michael Bronstein, Francesco Di Giovanni

Abstract
Message passing neural networks (MPNNs) have been shown to suffer from the phenomenon of over-squashing that causes poor performance for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node's immediate neighbours. Rewiring approaches attempting to make graphs 'more connected', and supposedly better suited to long-range tasks, often lose the inductive bias provided by distance on the graph since they make distant nodes communicate instantly at every layer. In this paper we propose a framework, applicable to any MPNN architecture, that performs a layer-dependent rewiring to ensure gradual densification of the graph. We also propose a delay mechanism that permits skip connections between nodes depending on the layer and their mutual distance. We validate our approach on several long-range tasks and show that it outperforms graph Transformers and multi-hop MPNNs.
Code Repositories
https://github.com/BenGutteridge/DRew
Benchmarks
| Benchmark | Task | Method | Metric |
|---|---|---|---|
| Peptides-func | Graph classification | DRew-GCN+LapPE | AP: 0.7150 ± 0.0044 |
| Peptides-struct | Graph regression | DRew-GCN+LapPE | MAE: 0.2536 ± 0.0015 |
| PCQM-Contact | Link prediction | DRew-GCN | MRR: 0.3444 ± 0.0017 |
| PascalVOC-SP | Node classification | DRew-GatedGCN+LapPE | Macro F1: 0.3314 ± 0.0024 |