Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search
Xiangxiang Chu, Tianbao Zhou, Bo Zhang, Jixiang Li

Abstract
Differentiable Architecture Search (DARTS) is now a widely disseminated weight-sharing neural architecture search method. However, it suffers from a well-known performance collapse due to an inevitable aggregation of skip connections. In this paper, we first disclose that the root cause lies in an unfair advantage within an exclusive competition. Through experiments, we show that if either of two conditions is broken, the collapse disappears. Accordingly, we present a novel approach called Fair DARTS, where the exclusive competition is relaxed to be collaborative. Specifically, we let each operation's architectural weight be independent of the others. However, an important issue of discretization discrepancy remains. We then propose a zero-one loss to push architectural weights towards zero or one, which approximates the expected multi-hot solution. Our experiments are performed on two mainstream search spaces, and we derive new state-of-the-art results on CIFAR-10 and ImageNet. Our code is available at https://github.com/xiaomi-automl/fairdarts.
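To make the two ideas concrete, here is a minimal PyTorch sketch of (a) a collaborative mixed operation that weights each candidate by an independent sigmoid rather than a shared softmax, and (b) a zero-one loss assumed to take the form −(1/N) Σᵢ (σ(αᵢ) − 0.5)². The names (`SigmoidMixedOp`, `zero_one_loss`) and the exact loss form are illustrative assumptions for this page, not the authors' reference implementation; see the linked repository for that.

```python
import torch
import torch.nn as nn


def zero_one_loss(alphas: torch.Tensor) -> torch.Tensor:
    """Auxiliary zero-one loss (assumed form: -(1/N) * sum((sigmoid(a) - 0.5)^2)).

    Minimizing it pushes each sigmoid(alpha) away from 0.5 and toward
    0 or 1, shrinking the gap between the continuous relaxation and
    the discretized (multi-hot) architecture.
    """
    s = torch.sigmoid(alphas)
    return -torch.mean((s - 0.5) ** 2)


class SigmoidMixedOp(nn.Module):
    """Mixed edge where operations collaborate instead of competing.

    Each candidate op gets its own sigmoid gate, so strengthening one
    op (e.g. a skip connection) does not suppress the others the way a
    softmax over the edge would.
    """

    def __init__(self, ops: list[nn.Module]):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.alphas = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.sigmoid(self.alphas)  # independent gates in (0, 1)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


# Usage sketch: add the weighted zero-one term to the search objective.
# lambda_01 (the weighting coefficient) is a hypothetical hyperparameter here.
edge = SigmoidMixedOp([nn.Identity(), nn.Conv2d(8, 8, 3, padding=1)])
x = torch.randn(2, 8, 16, 16)
task_loss = edge(x).pow(2).mean()  # stand-in for the real training loss
lambda_01 = 1.0
total_loss = task_loss + lambda_01 * zero_one_loss(edge.alphas)
total_loss.backward()
```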
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| Neural Architecture Search on CIFAR-10 | FairDARTS-a | FLOPs: 746M; Parameters: 2.8M; Search Time: 0.25 GPU days; Top-1 Error Rate: 2.54% |
| Neural Architecture Search on ImageNet | FairDARTS-C | MACs: 386M; Top-1 Error Rate: 22.8% |