
Bonsai-Net: One-Shot Neural Architecture Search via Differentiable Pruners

Rob Geada, Dennis Prangle, Andrew Stephen McGough


Abstract

One-shot Neural Architecture Search (NAS) aims to minimize the computational expense of discovering state-of-the-art models. However, in the past year attention has been drawn to the comparable performance of naive random search across the same search spaces used by leading NAS algorithms. To address this, we explore the effects of drastically relaxing the NAS search space, and we present Bonsai-Net, an efficient one-shot NAS method to explore our relaxed search space. Bonsai-Net is built around a modified differential pruner and can consistently discover state-of-the-art architectures that are significantly better than random search with fewer parameters than other state-of-the-art methods. Additionally, Bonsai-Net performs simultaneous model search and training, dramatically reducing the total time it takes to generate fully-trained models from scratch.
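The abstract's core mechanism is a differentiable pruner that lets architecture search and weight training share a single training loop. The following is a minimal illustrative sketch in PyTorch, assuming sigmoid-gated candidate operations and threshold-based pruning; the names (DifferentiablePruner, candidate_ops, threshold) are hypothetical and not taken from the Bonsai-Net codebase.

```python
# Hypothetical sketch of a differentiable pruner; illustrative only,
# not the Bonsai-Net implementation.
import torch
import torch.nn as nn

class DifferentiablePruner(nn.Module):
    """Mixes candidate operations through learnable gates, so the choice
    of which ops survive is trained jointly with the op weights."""

    def __init__(self, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        # One learnable gate per candidate operation.
        self.gates = nn.Parameter(torch.zeros(len(candidate_ops)))
        self.alive = [True] * len(candidate_ops)

    def forward(self, x):
        # Gate each surviving op's output; gradients from the task loss
        # flow into both the op weights and the gates.
        weights = torch.sigmoid(self.gates)
        out = 0
        for i, op in enumerate(self.ops):
            if self.alive[i]:
                out = out + weights[i] * op(x)
        return out

    def prune(self, threshold=0.1):
        # Permanently drop operations whose gate has decayed below the
        # threshold; the remaining ops continue training uninterrupted.
        with torch.no_grad():
            weights = torch.sigmoid(self.gates)
            for i in range(len(self.ops)):
                if weights[i] < threshold:
                    self.alive[i] = False
```

Because the gates receive gradients from the task loss itself, pruning decisions emerge during ordinary training rather than in a separate search phase, which is what allows simultaneous search and training to reduce the end-to-end cost of producing a fully trained model.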


Benchmarks

Benchmark: neural-architecture-search-on-cifar-10
Methodology: Bonsai-Net
Metrics:
  Parameters: 2.9M
  Search Time (GPU days): 0.10
  Top-1 Error Rate: 3.35%
