A Generic Graph-based Neural Architecture Encoding Scheme for Predictor-based NAS

Xuefei Ning, Yin Zheng, Tianchen Zhao, Yu Wang, Huazhong Yang


Abstract

This work proposes a novel Graph-based neural ArchiTecture Encoding Scheme, a.k.a. GATES, to improve predictor-based neural architecture search. Specifically, unlike existing graph-based schemes, GATES models the operations as transformations of the propagating information, which mimics the actual data processing of a neural architecture. GATES provides a more reasonable modeling of neural architectures, and can consistently encode architectures from both the "operation on node" and "operation on edge" cell search spaces. Experimental results on various search spaces confirm GATES's effectiveness in improving the performance predictor. Furthermore, equipped with the improved performance predictor, the sample efficiency of the predictor-based neural architecture search (NAS) flow is boosted. Codes are available at https://github.com/walkerning/aw_nas.
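To make the core idea concrete, below is a minimal, hypothetical sketch of a GATES-style encoder for an "operation on edge" cell search space: each operation is represented by a learnable embedding that acts as a gate on the "virtual information" propagated along the architecture DAG, and the output node's information serves as the architecture embedding for a downstream performance predictor. The class name ToyGATESEncoder, the exact gating formula, and all dimensions are illustrative assumptions, not the paper's precise design; the official implementation lives in the linked aw_nas repository.

```python
# A toy GATES-style architecture encoder (illustrative sketch, assumptions noted above).
import torch
import torch.nn as nn


class ToyGATESEncoder(nn.Module):
    def __init__(self, num_op_types, node_dim=32):
        super().__init__()
        self.node_dim = node_dim
        self.op_emb = nn.Embedding(num_op_types, node_dim)  # learnable operation embeddings
        self.x_proj = nn.Linear(node_dim, node_dim)          # transform of incoming information
        self.gate_proj = nn.Linear(node_dim, node_dim)       # operation embedding -> gate

    def forward(self, edges, num_nodes):
        """edges: list of (src_node, dst_node, op_type) for a topologically
        ordered cell DAG, with node 0 as the input node."""
        # "Virtual information" carried by each node; the input node starts from ones.
        infos = [torch.zeros(self.node_dim) for _ in range(num_nodes)]
        infos[0] = torch.ones(self.node_dim)
        for src, dst, op in edges:
            op_vec = self.op_emb(torch.tensor(op))
            gate = torch.sigmoid(self.gate_proj(op_vec))     # the operation acts as a gate
            infos[dst] = infos[dst] + gate * self.x_proj(infos[src])
        # The output node's information is used as the architecture embedding.
        return infos[-1]


# Usage: encode a tiny 3-node cell, then score it with an (untrained) linear predictor.
encoder = ToyGATESEncoder(num_op_types=5)
arch = [(0, 1, 2), (0, 2, 4), (1, 2, 1)]      # (src, dst, op_type) edges
embedding = encoder(arch, num_nodes=3)
predicted_score = nn.Linear(32, 1)(embedding)
```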

Code Repositories

walkerning/aw_nas (official, PyTorch)

Benchmarks

Benchmark                                Methodology   Metrics
architecture-search-on-cifar-10-image    GATES + c/o   Params: 4.1M; Percentage error: 2.58
neural-architecture-search-on-imagenet   GATES         Accuracy: 75.9; Params: 5.6M; Top-1 Error Rate: 24.1

