HyperAI

Pure Noise to the Rescue of Insufficient Data: Improving Imbalanced Classification by Training on Random Noise Images

Shiran Zada, Itay Benou, Michal Irani

Abstract

Despite remarkable progress on visual recognition tasks, deep neural-nets still struggle to generalize well when training data is scarce or highly imbalanced, rendering them extremely vulnerable to real-world examples. In this paper, we present a surprisingly simple yet highly effective method to mitigate this limitation: using pure noise images as additional training data. Unlike the common use of additive noise or adversarial noise for data augmentation, we propose an entirely different perspective by directly training on pure random noise images. We present a new Distribution-Aware Routing Batch Normalization layer (DAR-BN), which enables training on pure noise images in addition to natural images within the same network. This encourages generalization and suppresses overfitting. Our proposed method significantly improves imbalanced classification performance, obtaining state-of-the-art results on a large variety of long-tailed image classification datasets (CIFAR-10-LT, CIFAR-100-LT, ImageNet-LT, Places-LT, and CelebA-5). Furthermore, our method is extremely simple and easy to use as a general new augmentation tool (on top of existing augmentations), and can be incorporated in any training scheme. It does not require any specialized data generation or training procedures, thus keeping training fast and efficient.
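The abstract describes routing pure-noise images and natural images through the same network while normalizing each with its own batch statistics. Below is a minimal NumPy sketch of that routing idea, assuming a DAR-BN-style layer in which the two branches keep separate normalization statistics but share the affine parameters; the class name `DARBatchNorm` and all implementation details are illustrative assumptions, not the official code (see shiranzada/pure-noise for the authors' implementation).

```python
import numpy as np

class DARBatchNorm:
    """Sketch of a Distribution-Aware Routing BN (DAR-BN)-style layer.

    Natural and pure-noise samples in a batch are normalized with
    separate per-branch statistics, while the affine parameters
    (gamma, beta) are shared across both branches.
    """

    def __init__(self, num_features, eps=1e-5):
        self.gamma = np.ones(num_features)   # shared scale
        self.beta = np.zeros(num_features)   # shared shift
        self.eps = eps

    def _normalize(self, x):
        # Standardize using statistics of this branch only.
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        return (x - mean) / np.sqrt(var + self.eps)

    def forward(self, x, is_noise):
        """x: (batch, features); is_noise: boolean mask per sample."""
        out = np.empty_like(x)
        # Route each branch to its own normalization statistics.
        for mask in (~is_noise, is_noise):
            if mask.any():
                out[mask] = self._normalize(x[mask])
        # Apply the shared affine transform to both branches.
        return self.gamma * out + self.beta

# Usage: a batch mixing natural features with pure-noise features.
rng = np.random.default_rng(0)
natural = rng.normal(5.0, 2.0, size=(4, 8))    # stand-in for natural images
noise = rng.normal(0.0, 1.0, size=(4, 8))      # pure random noise inputs
batch = np.concatenate([natural, noise])
is_noise = np.array([False] * 4 + [True] * 4)

bn = DARBatchNorm(num_features=8)
y = bn.forward(batch, is_noise)
```

Normalizing the noise branch separately keeps the noise statistics from corrupting the running statistics of natural images, which is what lets both share one network.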

Code Repositories

shiranzada/pure-noise (official)

Benchmarks

Benchmark                                 | Methodology             | Metrics
long-tail-learning-on-celeba-5            | OPeN (WideResNet-28-10) | Error Rate: 19.1
long-tail-learning-on-cifar-10-lt-r-100   | OPeN (WideResNet-28-10) | Error Rate: 13.9
long-tail-learning-on-cifar-10-lt-r-50    | OPeN (WideResNet-28-10) | Error Rate: 10.8
long-tail-learning-on-cifar-100-lt-r-100  | OPeN (WideResNet-28-10) | Error Rate: 45.8
long-tail-learning-on-cifar-100-lt-r-50   | OPeN (WideResNet-28-10) | Error Rate: 40.2
long-tail-learning-on-imagenet-lt         | OPeN (ResNeXt-50)       | Top-1 Accuracy: 55.1
long-tail-learning-on-places-lt           | OPeN (ResNet-152)       | Top-1 Accuracy: 40.5
