Optimization of Graph Neural Networks with Natural Gradient Descent

Mohammad Rasool Izadi, Yihao Fang, Robert Stevenson, Lizhen Lin

Abstract

In this work, we propose to employ information-geometric tools to optimize graph neural network architectures such as graph convolutional networks. More specifically, we develop optimization algorithms for graph-based semi-supervised learning by employing natural gradient information in the optimization process. This allows us to efficiently exploit the geometry of the underlying statistical model or parameter space for optimization and inference. To the best of our knowledge, this is the first work that utilizes the natural gradient for the optimization of graph neural networks, and the approach can be extended to other semi-supervised problems. Efficient computational algorithms are developed, and extensive numerical studies are conducted to demonstrate the superior performance of our algorithms over existing algorithms such as ADAM and SGD.
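To give a concrete sense of what natural gradient optimization involves, the sketch below applies a plain natural gradient step (empirical Fisher preconditioning) to a one-layer graph convolutional network in PyTorch. This is only an illustrative sketch under simplifying assumptions, not the paper's SSP implementation (see russellizadi/ssp for the official code); the synthetic graph, the helper names such as `natural_gradient_step`, and the damping constant are assumptions made for illustration.

```python
import torch

torch.manual_seed(0)

# Synthetic toy graph (hypothetical data): 6 nodes, 3 features, 2 classes.
n, d, c = 6, 3, 2
adj = (torch.rand(n, n) > 0.5).float()
adj = torch.clamp(adj + adj.T + torch.eye(n), max=1.0)       # symmetric, with self-loops
deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
a_hat = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]  # D^{-1/2} A D^{-1/2}
x = torch.randn(n, d)
y = torch.randint(0, c, (n,))

w = torch.randn(d, c, requires_grad=True)                    # single GCN weight matrix


def gcn_logits(w):
    # One-layer GCN: logits = A_hat X W (softmax is applied inside cross_entropy).
    return a_hat @ x @ w


def natural_gradient_step(w, lr=0.1, damping=1e-3):
    # Ordinary gradient of the loss over all labeled nodes.
    loss = torch.nn.functional.cross_entropy(gcn_logits(w), y)
    grad = torch.autograd.grad(loss, w)[0].reshape(-1)

    # Empirical Fisher built from per-node gradients of the log-likelihood.
    per_node_grads = []
    for i in range(n):
        loss_i = torch.nn.functional.cross_entropy(gcn_logits(w)[i:i + 1], y[i:i + 1])
        per_node_grads.append(torch.autograd.grad(loss_i, w)[0].reshape(-1))
    g = torch.stack(per_node_grads)                           # shape (n, d*c)
    fisher = g.T @ g / n + damping * torch.eye(d * c)

    # Natural gradient step: precondition the gradient with the inverse Fisher.
    nat_grad = torch.linalg.solve(fisher, grad)
    with torch.no_grad():
        w -= lr * nat_grad.reshape(d, c)
    return loss.item()


for step in range(5):
    print(f"step {step}: loss = {natural_gradient_step(w):.4f}")
```

Explicitly forming and inverting the Fisher matrix is only feasible here because the toy model has a handful of parameters; practical natural gradient methods rely on structured approximations of the Fisher rather than the dense solve shown above.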

Code Repositories

russellizadi/ssp (official PyTorch implementation; mentioned in GitHub)

Benchmarks

Benchmark                                       Methodology   Accuracy
node-classification-on-citeseer                 SSP           80.52% ± 0.14%
node-classification-on-citeseer-with-public     SSP           74.28% ± 0.67%
node-classification-on-cora                     SSP           90.16% ± 0.59%
node-classification-on-cora-with-public-split   SSP           82.84% ± 0.87%
node-classification-on-pubmed                   SSP           89.36% ± 0.57%
node-classification-on-pubmed-with-public       SSP           80.06% ± 0.34%
