Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search

Han Xiao, Ziwei Wang, Zheng Zhu, Jie Zhou, Jiwen Lu


Abstract

In this paper, we propose a Shapley value based method to evaluate operation contribution (Shapley-NAS) for neural architecture search. Differentiable architecture search (DARTS) acquires the optimal architectures by optimizing the architecture parameters with gradient descent, which significantly reduces the search cost. However, the magnitude of architecture parameters updated by gradient descent fails to reveal the actual operation importance to the task performance and therefore harms the effectiveness of the obtained architectures. By contrast, we propose to evaluate the direct influence of operations on validation accuracy. To deal with the complex relationships between supernet components, we leverage the Shapley value to quantify their marginal contributions by considering all possible combinations. Specifically, we iteratively optimize the supernet weights and update the architecture parameters by evaluating operation contributions via the Shapley value, so that the optimal architectures are derived by selecting the operations that contribute significantly to the task. Since the exact computation of the Shapley value is NP-hard, a Monte-Carlo sampling based algorithm with early truncation is employed for efficient approximation, and a momentum update mechanism is adopted to alleviate the fluctuation of the sampling process. Extensive experiments on various datasets and search spaces show that our Shapley-NAS outperforms the state-of-the-art methods by a considerable margin at a light search cost. The code is available at https://github.com/Euphoria16/Shapley-NAS.git
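To make the search procedure in the abstract concrete, the sketch below illustrates Monte-Carlo Shapley estimation with early truncation and a momentum-smoothed accumulator. It is a minimal, illustrative assumption of how such an estimator could look, not the authors' implementation: the function names, the toy `evaluate_subset` stand-in for supernet validation accuracy, and all hyperparameter values are hypothetical.

```python
"""Minimal sketch: Monte-Carlo Shapley estimation with early truncation
and a momentum update, assuming a generic `evaluate_subset` oracle that
returns a validation metric for a chosen subset of candidate operations.
All names and values here are illustrative, not the official code."""
import random


def mc_shapley(players, evaluate_subset, num_permutations=20, trunc_tol=1e-3):
    """Approximate Shapley values by sampling permutations of `players`.

    Early truncation: once the running coalition's value is within
    `trunc_tol` of the full-coalition value, the remaining players in the
    current permutation are assigned zero marginal contribution.
    """
    full_value = evaluate_subset(set(players))
    shapley = {p: 0.0 for p in players}
    for _ in range(num_permutations):
        perm = random.sample(players, len(players))  # one random permutation
        coalition, prev_value = set(), evaluate_subset(set())
        for p in perm:
            if abs(full_value - prev_value) < trunc_tol:
                break  # early truncation: remaining marginals are ~0
            coalition.add(p)
            value = evaluate_subset(coalition)
            shapley[p] += (value - prev_value) / num_permutations
            prev_value = value
    return shapley


def momentum_update(accumulated, new_estimate, momentum=0.8):
    """Smooth the sampled Shapley estimates across search iterations."""
    return {p: momentum * accumulated.get(p, 0.0) + (1 - momentum) * v
            for p, v in new_estimate.items()}


if __name__ == "__main__":
    # Toy 'validation accuracy': each candidate operation adds a fixed,
    # purely hypothetical gain on top of a 0.5 baseline.
    gains = {"skip_connect": 0.02, "sep_conv_3x3": 0.15,
             "max_pool_3x3": 0.05, "none": 0.0}
    evaluate = lambda subset: 0.5 + sum(gains[p] for p in subset)

    running = {}
    for step in range(5):  # stands in for alternating supernet-weight updates
        est = mc_shapley(list(gains), evaluate, num_permutations=50)
        running = momentum_update(running, est)
    # Operations with the largest estimated contribution would be kept.
    print(sorted(running.items(), key=lambda kv: -kv[1]))
```

In this sketch the momentum-smoothed Shapley estimates play the role that the magnitudes of architecture parameters play in DARTS: the final architecture is derived by keeping, on each edge, the operations with the largest estimated contribution.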

Code Repositories

euphoria16/shapley-nas (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
neural-architecture-search-on-cifar-10 | Shapley-NAS (best) | Parameters: 3.6M; Top-1 Error Rate: 2.43%
neural-architecture-search-on-imagenet | Shapley-NAS | MACs: 582M; Params: 5.4M; Top-1 Error Rate: 23.9%
neural-architecture-search-on-nas-bench-201 | Shapley-NAS | Accuracy (Test): 46.85%; Accuracy (Val): 46.57%
neural-architecture-search-on-nas-bench-201-1 | Shapley-NAS | Accuracy (Test): 94.37%; Accuracy (Val): 91.61%
neural-architecture-search-on-nas-bench-201-2 | Shapley-NAS | Accuracy (Test): 73.51%; Accuracy (Val): 73.49%
