HyperAI
Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation

Shichang Zhang, Yozen Liu, Yizhou Sun, Neil Shah

Abstract

Graph Neural Networks (GNNs) are popular for graph machine learning and have shown strong results on a wide range of node classification tasks. Yet they remain less popular for practical industrial deployments due to scalability challenges caused by data dependency: GNN inference depends on neighbor nodes multiple hops away from the target, and fetching these neighbors burdens latency-constrained applications. Existing inference-acceleration methods such as pruning and quantization can speed up GNNs by reducing Multiplication-and-ACcumulation (MAC) operations, but the improvements are limited because the data dependency is not resolved. Conversely, multi-layer perceptrons (MLPs) have no graph dependency and infer much faster than GNNs, although they are generally less accurate for node classification. Motivated by these complementary strengths and weaknesses, we bring GNNs and MLPs together via knowledge distillation (KD). Our work shows that MLP performance can be improved by large margins with GNN KD. We call the distilled MLPs Graph-less Neural Networks (GLNNs), as they have no inference graph dependency. We show that GLNNs with competitive accuracy infer 146x-273x faster than GNNs and 14x-27x faster than other acceleration methods. Under a production setting involving both transductive and inductive predictions across 7 datasets, GLNN accuracies improve over stand-alone MLPs by 12.36% on average and match GNNs on 6/7 datasets. Comprehensive analysis shows when and why GLNNs can achieve accuracies competitive with GNNs, and suggests GLNN as a handy choice for latency-constrained applications.
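The distillation setup described in the abstract, training an MLP student on a combination of the true labels and a GNN teacher's soft predictions, can be illustrated with a minimal NumPy sketch. The function and parameter names below (`glnn_loss`, the mixing weight `lam`, the temperature `tau`) are illustrative assumptions, not the paper's official implementation; the repository linked under Code Repositories contains the actual PyTorch code.

```python
import numpy as np

def softmax(logits, tau=1.0):
    """Row-wise softmax with temperature tau (numerically stabilized)."""
    z = logits / tau
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def glnn_loss(mlp_logits, gnn_logits, labels, lam=0.5, tau=1.0):
    """Distillation-style objective for a GLNN-like student:
    lam * cross-entropy(labels, student) + (1 - lam) * KL(teacher || student).
    mlp_logits / gnn_logits: (n_nodes, n_classes); labels: (n_nodes,) int."""
    p_student = softmax(mlp_logits, tau)
    p_teacher = softmax(gnn_logits, tau)
    n = mlp_logits.shape[0]
    # Hard-label cross-entropy on the student's predictions.
    ce = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # KL divergence from the GNN teacher's soft targets to the student.
    kl = (p_teacher * (np.log(p_teacher + 1e-12)
                       - np.log(p_student + 1e-12))).sum(axis=1).mean()
    return lam * ce + (1.0 - lam) * kl
```

When the student's logits already match the teacher's, the KL term vanishes and only the label loss remains; `lam` trades off fitting ground-truth labels against imitating the teacher's soft distribution.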

Code Repositories

snap-research/graphless-neural-networks (official, PyTorch)

Benchmarks

Benchmark                               Methodology   Metrics
node-classification-on-amz-computers    GLNN          Accuracy: 83.03 ± 1.87%
node-classification-on-amz-photo        GLNN          Accuracy: 92.11 ± 1.08%
node-classification-on-citeseer         GLNN          Accuracy: 71.77 ± 2.01%
node-classification-on-cora             GLNN          Accuracy: 80.54 ± 1.35%
node-classification-on-pubmed           GLNN          Accuracy: 75.42 ± 2.31%
