Raymond Ptucha, Sunand Raghupathi, Naga Durga Harish Kanamarlapudi, Rohan Dhamdhere, Miguel Dominguez
Abstract
Architecture design and hyperparameter selection for deep neural networks often involve guesswork. The parameter space is too large to try all possibilities, so one often settles for a suboptimal solution. Some works have proposed automatic architecture and hyperparameter search, but they are constrained to image applications. We propose an evolution framework for graph data that is extensible to generic graphs. Our evolution mutates a population of neural networks to search the architecture and hyperparameter space. At each stage of the neuroevolution process, neural network layers can be added or removed, hyperparameters can be adjusted, or additional epochs of training can be applied. Mutation-selection probabilities based on recent successes help guide the learning process toward efficient and accurate learning. We achieve state-of-the-art accuracy on MUTAG protein classification with a small population of 10 networks, and gain interesting insight into how to build effective network architectures incrementally.
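To make the loop concrete, the sketch below illustrates the kind of neuroevolution procedure the abstract describes: a population of network encodings, the four mutation operators (add layer, remove layer, adjust a hyperparameter, train for more epochs), and operator-selection probabilities that are reweighted by recent successes. All names here (`mutate`, `evolve`, the dict encoding, the `+0.1` reward) are illustrative assumptions, not the paper's implementation, and `evaluate` stands in for whatever fitness function (e.g. validation accuracy) is used.

```python
# Minimal neuroevolution sketch (assumed encoding, not the paper's code).
# A network is a dict: {"layers": [...], "lr": float, "epochs": int}.
import copy
import random

MUTATIONS = ["add_layer", "remove_layer", "tweak_hparam", "train_more"]

def mutate(net, op):
    """Apply one mutation operator to a copy of the network encoding."""
    child = copy.deepcopy(net)
    if op == "add_layer":
        child["layers"].insert(random.randrange(len(child["layers"]) + 1),
                               {"units": random.choice([16, 32, 64])})
    elif op == "remove_layer" and len(child["layers"]) > 1:
        child["layers"].pop(random.randrange(len(child["layers"])))
    elif op == "tweak_hparam":
        child["lr"] *= random.choice([0.5, 2.0])
    elif op == "train_more":
        child["epochs"] += 10
    return child

def evolve(population, evaluate, generations=50):
    # Start with uniform operator weights; reward operators whose
    # children beat their parents (the "recent successes" heuristic).
    weights = {op: 1.0 for op in MUTATIONS}
    for _ in range(generations):
        children = []
        for parent in population:
            op = random.choices(MUTATIONS,
                                [weights[o] for o in MUTATIONS])[0]
            child = mutate(parent, op)
            if evaluate(child) > evaluate(parent):
                weights[op] += 0.1  # bias future picks toward what worked
            children.append(child)
        # Elitist survival: keep the fittest half of parents + children.
        population = sorted(population + children, key=evaluate,
                            reverse=True)[:len(population)]
    return population
```

The success-weighted operator choice is a simple credit-assignment scheme: operators that recently produced improved children are drawn more often, which matches the abstract's description of mutation probabilities guiding the search.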
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| graph-classification-on-enzymes | Evolution of Graph Classifiers | Accuracy: 55.67% |
| graph-classification-on-mutag | Evolution of Graph Classifiers | Accuracy: 100.00%; Accuracy (10-fold): 100.00% |