Combining Label Propagation and Simple Models Out-performs Graph Neural Networks

Qian Huang, Horace He, Abhay Singh, Ser-Nam Lim, Austin R. Benson

Abstract

Graph Neural Networks (GNNs) are the predominant technique for learning over graphs. However, there is relatively little understanding of why GNNs are successful in practice and whether they are necessary for good performance. Here, we show that for many standard transductive node classification benchmarks, we can exceed or match the performance of state-of-the-art GNNs by combining shallow models that ignore the graph structure with two simple post-processing steps that exploit correlation in the label structure: (i) an "error correlation" that spreads residual errors in training data to correct errors in test data and (ii) a "prediction correlation" that smooths the predictions on the test data. We call this overall procedure Correct and Smooth (C&S), and the post-processing steps are implemented via simple modifications to standard label propagation techniques from early graph-based semi-supervised learning methods. Our approach exceeds or nearly matches the performance of state-of-the-art GNNs on a wide variety of benchmarks, with just a small fraction of the parameters and orders of magnitude faster runtime. For instance, we exceed the best known GNN performance on the OGB-Products dataset with 137 times fewer parameters and greater than 100 times less training time. The performance of our methods highlights how directly incorporating label information into the learning algorithm (as was done in traditional techniques) yields easy and substantial performance gains. We can also incorporate our techniques into big GNN models, providing modest gains. Our code for the OGB results is at https://github.com/Chillee/CorrectAndSmooth.
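The two post-processing steps described above can be sketched directly with label propagation over the symmetrically normalized adjacency matrix. The following is a minimal dense-numpy illustration, not the authors' implementation: the iteration counts, the propagation coefficients `alpha1`/`alpha2`, and the unscaled residual addition are simplifying assumptions (the paper's variants include scaled residual correction).

```python
import numpy as np

def normalized_adj(A):
    # S = D^{-1/2} A D^{-1/2}, the symmetrically normalized adjacency
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def propagate(S, init, alpha, iters=50):
    # Standard label propagation: H <- (1 - alpha) * init + alpha * S @ H
    H = init.copy()
    for _ in range(iters):
        H = (1 - alpha) * init + alpha * (S @ H)
    return H

def correct_and_smooth(A, Z, Y, train_mask, alpha1=0.8, alpha2=0.8):
    """A: adjacency matrix; Z: base predictions from a graph-agnostic
    model (e.g. an MLP); Y: one-hot labels; train_mask: boolean mask."""
    S = normalized_adj(A)
    # (i) Correct: spread residual errors on training nodes to test nodes
    E = np.zeros_like(Z)
    E[train_mask] = Y[train_mask] - Z[train_mask]
    Z_corrected = Z + propagate(S, E, alpha1)
    # (ii) Smooth: clamp training nodes to ground truth, then propagate
    G = Z_corrected.copy()
    G[train_mask] = Y[train_mask]
    return propagate(S, G, alpha2)
```

On a toy path graph with labeled endpoints and uninformative base predictions, the smoothed output pulls each unlabeled node toward its nearer labeled neighbor, which is the effect the abstract describes.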

Code Repositories

- ytchx1999/GCN_res-CS-v2 (PyTorch)
- Chillee/CorrectAndSmoothOGB (PyTorch)
- ytchx1999/PyG-GCN_res-CS (PyTorch)
- xnuohz/CorrectAndSmooth-dgl (PyTorch)
- sangyx/gtrick/tree/main/benchmark/pyg (PyTorch)
- CUAI/CorrectAndSmooth (official, PyTorch)

Benchmarks

| Benchmark | Methodology | Metric |
|---|---|---|
| node-classification-on-genius | C&S 1-hop | Accuracy: 82.93 ± 0.15 |
| node-classification-on-genius | C&S 2-hop | Accuracy: 84.94 ± 0.49 |
| node-classification-on-non-homophilic-13 | C&S 1-hop | 1:1 Accuracy: 74.28 ± 1.19 |
| node-classification-on-non-homophilic-13 | C&S 2-hop | 1:1 Accuracy: 78.40 ± 3.12 |
| node-classification-on-non-homophilic-14 | C&S 2-hop | 1:1 Accuracy: 84.94 ± 0.49 |
| node-classification-on-non-homophilic-14 | C&S 1-hop | 1:1 Accuracy: 82.93 ± 0.15 |
| node-classification-on-non-homophilic-15 | C&S 1-hop | 1:1 Accuracy: 64.86 ± 0.27 |
| node-classification-on-non-homophilic-15 | C&S 2-hop | 1:1 Accuracy: 65.02 ± 0.16 |
| node-classification-on-non-homophilic-6 | C&S 2-hop | 1:1 Accuracy: 64.52 ± 0.62 |
| node-classification-on-non-homophilic-6 | C&S 1-hop | 1:1 Accuracy: 64.60 ± 0.57 |
| node-classification-on-penn94 | C&S 2-hop | Accuracy: 78.40 ± 3.12 |
| node-classification-on-penn94 | C&S 1-hop | Accuracy: 74.28 ± 1.19 |