DARTS-PRIME: Regularization and Scheduling Improve Constrained Optimization in Differentiable NAS
Kaitlin Maile Erwan Lecarpentier Hervé Luga Dennis G. Wilson

Abstract
Differentiable Architecture Search (DARTS) is a recent neural architecture search (NAS) method based on a differentiable relaxation. Due to its success, numerous variants analyzing and improving parts of the DARTS framework have recently been proposed. By considering the problem as a constrained bilevel optimization, we present and analyze DARTS-PRIME, a variant that improves both the scheduling of architectural weight updates and the regularization towards discretization. We propose a dynamic schedule, based on per-minibatch network information, to make architecture updates more informed, as well as a proximity regularization to promote well-separated discretization. Our results across multiple domains show that DARTS-PRIME improves both performance and reliability, with results comparable to the state of the art in differentiable NAS.
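The sketch below illustrates, in simplified PyTorch, the two ingredients named in the abstract: a proximity penalty that pulls the relaxed architecture weights toward a discrete (one-hot) choice, and a dynamic trigger that decides per minibatch whether to perform an architecture update. The specific penalty form (squared L2 distance to the nearest one-hot vector), the trigger signal (weight-gradient norm threshold), and all names (`proximity_penalty`, `maybe_update_architecture`, `lam`, `threshold`) are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal sketch of a DARTS-style architecture step with (i) an assumed proximity
# penalty toward discretization and (ii) an assumed per-minibatch update trigger.
import torch
import torch.nn.functional as F


def proximity_penalty(alpha: torch.Tensor) -> torch.Tensor:
    """Assumed penalty: squared L2 distance between softmax(alpha) and its
    nearest one-hot vector, summed over edges (rows of alpha)."""
    probs = F.softmax(alpha, dim=-1)
    one_hot = F.one_hot(probs.argmax(dim=-1), num_classes=alpha.size(-1)).float()
    return ((probs - one_hot) ** 2).sum()


def maybe_update_architecture(alpha, arch_opt, val_loss_fn,
                              w_grad_norm, threshold=1.0, lam=0.1):
    """Assumed dynamic schedule: update alpha only when the current minibatch's
    weight-gradient norm is below a threshold (a proxy for 'informative' steps);
    otherwise skip the architecture update for this minibatch."""
    if w_grad_norm >= threshold:
        return False
    arch_opt.zero_grad()
    # Validation loss plus the regularizer pushing alpha toward a discrete choice.
    loss = val_loss_fn() + lam * proximity_penalty(alpha)
    loss.backward()
    arch_opt.step()
    return True
```

In an actual DARTS-like search loop, `alpha` would hold the per-edge operation mixing weights, `val_loss_fn` would evaluate the supernet on a validation minibatch, and `w_grad_norm` would come from the preceding weight-update step.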
Benchmarks
| Benchmark | Method | Parameters | Search Time (GPU days) | Top-1 Error (%) |
|---|---|---|---|---|
| Neural Architecture Search on CIFAR-10 | DARTS-PRIME | 3.7M | 0.5 | 2.62 |
| Neural Architecture Search on CIFAR-100 | DARTS-PRIME | 3.16M | N/A | 17.44 |