DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks

Pál András Papp, Karolis Martinkus, Lukas Faber, Roger Wattenhofer

Abstract

This paper studies Dropout Graph Neural Networks (DropGNNs), a new approach that aims to overcome the limitations of standard GNN frameworks. In DropGNNs, we execute multiple runs of a GNN on the input graph, with some of the nodes randomly and independently dropped in each of these runs. Then, we combine the results of these runs to obtain the final result. We prove that DropGNNs can distinguish various graph neighborhoods that cannot be separated by message passing GNNs. We derive theoretical bounds for the number of runs required to ensure a reliable distribution of dropouts, and we prove several properties regarding the expressive capabilities and limits of DropGNNs. We experimentally validate our theoretical findings on expressiveness. Furthermore, we show that DropGNNs perform competitively on established GNN benchmarks.
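
The run-and-combine mechanism described above is straightforward to sketch in PyTorch, the language of the official repository. The snippet below is a minimal illustration, not the authors' implementation: the class name SimpleDropGNN, the defaults num_runs=10 and p_drop=0.1, and the single GIN-style layer are assumptions for exposition, and node dropout is emulated by zeroing features and masking messages rather than deleting nodes from the graph. The paper derives bounds on how many runs are needed for a reliable distribution of dropouts; the fixed run count here is only a placeholder.

```python
# Illustrative DropGNN-style forward pass in plain PyTorch (not the official code).
import torch
import torch.nn as nn


class SimpleDropGNN(nn.Module):
    """Runs one shared GNN num_runs times, independently dropping each node
    with probability p_drop in every run, then averages per-run embeddings."""

    def __init__(self, in_dim, hid_dim, num_runs=10, p_drop=0.1):
        super().__init__()
        self.num_runs = num_runs  # placeholder; the paper bounds the runs needed
        self.p_drop = p_drop
        # GIN-style update: MLP(own features + summed neighbor features).
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, hid_dim)
        )

    def forward(self, x, adj):
        # x: (n, in_dim) node features; adj: (n, n) dense adjacency matrix.
        n = x.size(0)
        # One independent keep/drop mask per run: shape (num_runs, n, 1).
        keep = (torch.rand(self.num_runs, n, 1, device=x.device) > self.p_drop).float()
        xs = x.unsqueeze(0) * keep  # dropped nodes contribute no features
        # Dropped nodes neither send nor receive messages in their run.
        adj_runs = adj.unsqueeze(0) * keep * keep.transpose(1, 2)
        h = self.mlp(xs + torch.bmm(adj_runs, xs))  # one message-passing step per run
        return h.mean(dim=0)  # combine the runs: (n, hid_dim)


# Usage on a random 5-node graph:
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
out = SimpleDropGNN(8, 16)(x, adj)  # -> (5, 16) node embeddings
```

Averaging over runs is one simple way to combine results; because each run sees a different random subgraph, the combined embedding can separate neighborhoods that a single deterministic message-passing pass cannot.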

Code Repositories

karolismart/dropgnn (official implementation, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
graph-classification-on-dd | DropGIN | Accuracy: 78.151±3.711
graph-classification-on-enzymes | DropGIN | Accuracy: 65.128±4.117
graph-classification-on-imdb-b | DropGIN | Accuracy: 75.7%
graph-classification-on-imdb-m | DropGIN | Accuracy: 51.4%
graph-classification-on-mutag | DropGIN | Accuracy: 90.4%
graph-classification-on-nci1 | DropGIN | Accuracy: 84.331±1.564
graph-classification-on-nci109 | DropGIN | Accuracy: 83.961±1.141
graph-classification-on-proteins | DropGIN | Accuracy: 76.3%
graph-classification-on-ptc | DropGIN | Accuracy: 66.3%
graph-regression-on-esr2 | DropGIN | R2: 0.675±0.000; RMSE: 0.503±0.675
graph-regression-on-f2 | DropGIN | R2: 0.886±0.000; RMSE: 0.343±0.886
graph-regression-on-kit | DropGIN | R2: 0.835±0.000; RMSE: 0.441±0.835
graph-regression-on-lipophilicity | DropGIN | R2: 0.809±0.008; RMSE: 0.552±0.012
graph-regression-on-parp1 | DropGIN | R2: 0.920±0.000; RMSE: 0.354±0.920
graph-regression-on-pgr | DropGIN | R2: 0.702±0.000; RMSE: 0.527±0.702
molecular-property-prediction-on-esol | DropGIN | R2: 0.935±0.012; RMSE: 0.520±0.048
molecular-property-prediction-on-freesolv | DropGIN | R2: 0.972±0.005; RMSE: 0.657±0.059
