Wenzheng Feng, Jie Zhang, Yuxiao Dong, Yu Han, Huanbo Luan, Qian Xu, Qiang Yang, Evgeny Kharlamov, Jie Tang

Abstract
We study the problem of semi-supervised learning on graphs, for which graph neural networks (GNNs) have been extensively explored. However, most existing GNNs inherently suffer from over-smoothing, non-robustness, and weak generalization when labeled nodes are scarce. In this paper, we propose a simple yet effective framework -- GRAPH RANDOM NEURAL NETWORKS (GRAND) -- to address these issues. In GRAND, we first design a random propagation strategy to perform graph data augmentation. We then leverage consistency regularization to optimize the prediction consistency of unlabeled nodes across different data augmentations. Extensive experiments on graph benchmark datasets suggest that GRAND significantly outperforms state-of-the-art GNN baselines on semi-supervised node classification. Finally, we show that GRAND mitigates the issues of over-smoothing and non-robustness, exhibiting better generalization behavior than existing GNNs. The source code of GRAND is publicly available at https://github.com/Grand20/grand.
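The two components described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation: `random_propagation` applies DropNode (randomly zeroing whole node feature rows, then rescaling) followed by mixed-order propagation with a normalized adjacency matrix, and `consistency_loss` penalizes disagreement between predictions from different augmentations. Function names, hyperparameters, and the omission of the paper's sharpening step are simplifications for illustration.

```python
import numpy as np

def random_propagation(X, A_hat, order=2, drop_rate=0.5, rng=None):
    """Sketch of GRAND-style augmentation: DropNode + mixed-order propagation.

    X      -- node feature matrix, shape (n, d)
    A_hat  -- normalized adjacency matrix, shape (n, n)
    order  -- propagation order (number of hops to average over)
    """
    if rng is None:
        rng = np.random.default_rng()
    n = X.shape[0]
    # DropNode: drop entire node rows with probability drop_rate,
    # then rescale surviving rows to keep the expectation unchanged.
    keep = rng.random(n) >= drop_rate
    X_drop = X * keep[:, None] / (1.0 - drop_rate)
    # Mixed-order propagation: average of A_hat^0 X, ..., A_hat^order X.
    out, H = X_drop.copy(), X_drop.copy()
    for _ in range(order):
        H = A_hat @ H
        out += H
    return out / (order + 1)

def consistency_loss(probs_list):
    """Mean squared distance of each augmentation's class probabilities
    to their average (the paper additionally sharpens the average;
    that step is omitted here for brevity)."""
    avg = np.mean(probs_list, axis=0)
    return float(np.mean([(p - avg) ** 2 for p in probs_list]))
```

In training, one would run `random_propagation` several times per step to obtain multiple augmented feature matrices, feed each through a shared classifier, and add `consistency_loss` over the unlabeled nodes to the supervised loss on the labeled nodes.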
Benchmarks
| Benchmark | Method | Accuracy (%) |
|---|---|---|
| Node Classification on CiteSeer (public split) | GRAND | 75.4 ± 0.4 |
| Node Classification on Cora (public split) | GRAND | 85.4 ± 0.4 |
| Node Classification on PubMed (public split) | GRAND | 82.7 ± 0.6 |