Biswadeep Chakraborty, Saibal Mukhopadhyay

Abstract
We present Model Uncertainty-aware Differentiable ARchiTecture Search ($\mu$DARTS), which optimizes neural networks to simultaneously achieve high accuracy and low uncertainty. We introduce concrete dropout within DARTS cells and include a Monte-Carlo regularizer in the training loss to optimize the concrete dropout probabilities. A predictive-variance term is added to the validation loss to enable searching for architectures with minimal model uncertainty. Experiments on CIFAR10, CIFAR100, SVHN, and ImageNet verify the effectiveness of $\mu$DARTS in improving accuracy and reducing uncertainty compared to existing DARTS methods. Moreover, the final architecture obtained from $\mu$DARTS shows higher robustness to noise in the input image and the model parameters than architectures obtained from existing DARTS methods.
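The two uncertainty-related ingredients described in the abstract, concrete dropout with learnable drop probabilities and a Monte-Carlo estimate of predictive variance, can be sketched in a few lines of PyTorch. The sketch below is illustrative and is not the authors' implementation: the names `ConcreteDropout` and `mc_predictive_variance`, the temperature, and the sample count are assumptions.

```python
import torch
import torch.nn as nn

class ConcreteDropout(nn.Module):
    """Dropout whose drop probability is learned via the concrete (Gumbel-sigmoid) relaxation."""
    def __init__(self, init_p: float = 0.1, temperature: float = 0.1):
        super().__init__()
        # Store p as a logit so the optimized probability stays in (0, 1).
        self.p_logit = nn.Parameter(torch.log(torch.tensor(init_p) / (1.0 - init_p)))
        self.temperature = temperature

    @property
    def p(self) -> torch.Tensor:
        return torch.sigmoid(self.p_logit)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        eps = 1e-7
        u = torch.rand_like(x)
        # Relaxed Bernoulli drop mask: values lie in (0, 1) instead of hard {0, 1},
        # so gradients flow back to p_logit.
        drop_mask = torch.sigmoid(
            (torch.log(self.p + eps) - torch.log(1.0 - self.p + eps)
             + torch.log(u + eps) - torch.log(1.0 - u + eps)) / self.temperature
        )
        x = x * (1.0 - drop_mask)
        return x / (1.0 - self.p)  # rescale to preserve the expected activation


def mc_predictive_variance(model: nn.Module, x: torch.Tensor, n_samples: int = 20) -> torch.Tensor:
    """Monte-Carlo estimate of model uncertainty: keep dropout stochastic and
    measure the variance of the sampled class probabilities."""
    model.train()  # keep concrete dropout active during the forward passes
    preds = torch.stack([torch.softmax(model(x), dim=-1) for _ in range(n_samples)])
    # Mean variance across samples, classes, and batch; differentiable, so it can
    # also serve as a loss penalty (wrap in torch.no_grad() for evaluation only).
    return preds.var(dim=0).mean()
```

In a DARTS-style search, such a variance term could be added to the validation loss with a weighting coefficient so that architectures with lower predictive variance are preferred.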
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| Neural Architecture Search on CIFAR-10 | $\mu$DARTS | FLOPS: 602M; Search Time: 0.1 GPU days; Top-1 Error Rate: 3.277% |
| Neural Architecture Search on CIFAR-100 | $\mu$DARTS | Params: 602M; Percentage Error: 19.39; Search Time: 1.57 GPU days |
| Neural Architecture Search on ImageNet | $\mu$DARTS | Accuracy: 78.76%; Params: 602M; Top-1 Error Rate: 21.24% |