Balanced Mixture of SuperNets for Learning the CNN Pooling Architecture

Mehraveh Javan Matthew Toews Marco Pedersoli

Abstract

Downsampling layers, including pooling and strided convolutions, are crucial components of the convolutional neural network architecture: they determine both the granularity/scale of image feature analysis and the receptive field size of a given layer. To fully understand this problem, we analyse the performance of models independently trained with each pooling configuration on CIFAR10, using a ResNet20 network, and show that the position of the downsampling layers can strongly influence the performance of a network and that predefined downsampling configurations are not optimal. Network Architecture Search (NAS) might be used to optimize downsampling configurations as a hyperparameter. However, we find that common one-shot NAS based on a single SuperNet does not work for this problem. We argue that this is because a SuperNet trained for finding the optimal pooling configuration fully shares its parameters among all pooling configurations. This makes its training hard, because learning some configurations can harm the performance of others. Therefore, we propose a balanced mixture of SuperNets that automatically associates pooling configurations with different weight models, reducing the weight sharing and the inter-influence of pooling configurations on the SuperNet parameters. We evaluate our proposed approach on CIFAR10, CIFAR100, and Food101 and show that in all cases our model outperforms other approaches and improves over the default pooling configurations.
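The core idea — partitioning pooling configurations across several weight models so that each SuperNet specializes in a balanced subset — can be sketched in toy form. Everything below (the function names, the greedy capacity rule, the random stand-in losses) is an illustrative assumption, not the authors' implementation: a pooling configuration is modeled as a choice of downsampling positions among the network layers, and each configuration is greedily assigned to the SuperNet under which it scores best, subject to a capacity limit that keeps the partition balanced.

```python
import itertools
import random

def pooling_configs(num_layers, num_pools):
    """All ways to place num_pools downsampling layers among num_layers."""
    return list(itertools.combinations(range(num_layers), num_pools))

def balanced_assignment(configs, losses, k):
    """Greedy balanced assignment of configurations to k SuperNets.

    Each configuration goes to the SuperNet where its (stand-in)
    validation loss is lowest, subject to a per-SuperNet capacity that
    keeps the partition balanced.  losses[c][m] is the loss of
    configuration c under SuperNet m (here: random placeholders).
    """
    capacity = -(-len(configs) // k)  # ceil(len(configs) / k)
    counts = [0] * k
    assignment = {}
    # Handle the configurations with the strongest preference first.
    order = sorted(configs, key=lambda c: min(losses[c]) - max(losses[c]))
    for c in order:
        ranked = sorted(range(k), key=lambda m: losses[c][m])
        for m in ranked:  # best SuperNet first, skip full ones
            if counts[m] < capacity:
                assignment[c] = m
                counts[m] += 1
                break
    return assignment

random.seed(0)
configs = pooling_configs(num_layers=6, num_pools=3)  # C(6,3) = 20 configs
k = 4
losses = {c: [random.random() for _ in range(k)] for c in configs}
assign = balanced_assignment(configs, losses, k)
sizes = [sum(1 for m in assign.values() if m == i) for i in range(k)]
print(sizes)  # each SuperNet handles an equal share of configurations
```

In an actual NAS loop, the placeholder losses would be replaced by each configuration's validation loss under the current weights of each SuperNet, and assignment and weight updates would alternate during training.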

Benchmarks

Benchmark                                   Methodology       Metrics
neural-architecture-search-on-cifar-10      Balanced Mixture  Accuracy (%): 91.55
neural-architecture-search-on-cifar-100-1   Balanced Mixture  Accuracy (%): 79.61
neural-architecture-search-on-food-101      Balanced Mixture  Accuracy (%): 84.73
