HyperAI

AlphaX: eXploring Neural Architectures with Deep Neural Networks and Monte Carlo Tree Search

Linnan Wang; Yiyang Zhao; Yuu Jinnai; Yuandong Tian; Rodrigo Fonseca

Abstract

Neural Architecture Search (NAS) has shown great success in automating the design of neural networks, but the prohibitive amount of computation behind current NAS methods calls for further investigation into improving sample efficiency and reducing network evaluation cost so that better results can be found in less time. In this paper, we present AlphaX, a novel, scalable Monte Carlo Tree Search (MCTS) based NAS agent that tackles both aspects. AlphaX improves search efficiency by adaptively balancing exploration and exploitation at the state level, and by using a Meta-Deep Neural Network (DNN) to predict network accuracies and bias the search toward promising regions. To amortize the network evaluation cost, AlphaX accelerates MCTS rollouts with a distributed design and reduces the number of epochs needed to evaluate a network via transfer learning guided by the tree structure in MCTS. In 12 GPU days and 1000 samples, AlphaX found an architecture that reaches 97.84% top-1 accuracy on CIFAR-10 and 75.5% top-1 accuracy on ImageNet, exceeding SOTA NAS methods in both accuracy and sample efficiency. We also evaluate AlphaX on NASBench-101, a large-scale NAS dataset; AlphaX is 3x and 2.8x more sample efficient than Random Search and Regularized Evolution, respectively, in finding the global optimum. Finally, we show that the searched architecture improves a variety of vision applications, from Neural Style Transfer to Image Captioning and Object Detection.
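The abstract's core mechanism — balancing exploration and exploitation at the state level while biasing selection with a meta-DNN's accuracy predictions — can be sketched as a UCT-style selection rule. This is an illustrative sketch only: the function names, the blending weight `beta`, and the `predictor` interface are assumptions for the example, not the paper's exact formulation.

```python
import math

def uct_score(child_visits, child_value_sum, parent_visits,
              predicted_acc, c=1.41, beta=0.5):
    """UCT-style score blending empirical accuracy, an exploration bonus,
    and a surrogate predictor's estimate (illustrative, not the paper's exact rule)."""
    if child_visits == 0:
        return float("inf")  # always expand unvisited children first
    exploitation = child_value_sum / child_visits          # mean observed accuracy
    exploration = c * math.sqrt(math.log(parent_visits) / child_visits)
    # Blend in the predictor's estimate to bias search toward promising regions.
    return (1 - beta) * exploitation + beta * predicted_acc + exploration

def select_child(children, parent_visits, predictor):
    """children: list of (arch, visits, value_sum); predictor: arch -> estimated accuracy."""
    return max(children,
               key=lambda ch: uct_score(ch[1], ch[2], parent_visits, predictor(ch[0])))
```

A larger `beta` leans more on the meta-DNN before real evaluations accumulate, while the exploration term keeps rarely visited states from being starved.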

Code Repositories

linnanwang/AlphaX-NASBench101 (Official, PyTorch)

Benchmarks

architecture-search-on-cifar-10-image
  Methodology: AlphaX-1 (cutout NASNet)
  Params: 3.59M

neural-architecture-search-on-cifar-10
  Methodology: AlphaX-1 (cutout NASNet)
  Search Time (GPU days): 224
  Top-1 Error Rate: 2.82%

neural-architecture-search-on-imagenet
  Methodology: AlphaX-1
  Accuracy: 75.5%
  Params: 5.4M
  Top-1 Error Rate: 24.5%
