Towards Calibrated Model for Long-Tailed Visual Recognition from Prior Perspective
Zhengzhuo Xu Zenghao Chai Chun Yuan

Abstract
Real-world data universally confront a severe class-imbalance problem and exhibit a long-tailed distribution, i.e., most labels are associated with only a few instances. Naïve models supervised by such datasets prefer the dominant labels, suffer serious generalization degradation, and are poorly calibrated. We propose two novel methods from the prior perspective to alleviate this dilemma. First, we derive a balance-oriented data augmentation named Uniform Mixup (UniMix) to promote mixup in long-tailed scenarios, which adopts an advanced mixing factor and sampler in favor of the minority classes. Second, motivated by Bayesian theory, we identify the Bayes Bias (Bayias), an inherent bias caused by the inconsistency between the training and test priors, and compensate for it via a modification of the standard cross-entropy loss. We further prove, both theoretically and empirically, that the proposed methods ensure classification calibration. Extensive experiments verify that each strategy contributes to a better-calibrated model, and their combination achieves state-of-the-art performance on CIFAR-LT, ImageNet-LT, and iNaturalist 2018.
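The abstract describes the two ingredients only at a high level. Below is a minimal PyTorch sketch of how they might fit together, assuming an empirical class prior `pi[c] = n_c / N` over `C` classes. The function names (`unimix_batch`, `bayias_ce`) and the exact factor-tilting rule are illustrative assumptions for exposition, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F


def unimix_batch(x, y, prior, alpha=1.0):
    """UniMix-style batch mixing (sketch).

    (i) Sampler: the mixing partner for each example is drawn with
        probability inversely proportional to its class prior, so tail
        classes are mixed in more often.
    (ii) Mixing factor: a Beta(alpha, alpha) draw, tilted so the rarer
        class of each pair receives more weight (illustrative rule).
    """
    n = x.size(0)
    # (i) inverse-frequency sampling of partner indices within the batch
    weights = 1.0 / prior[y]
    idx = torch.multinomial(weights, n, replacement=True)
    # (ii) Beta-sampled factor, then prior-reweighted toward the tail class
    lam = torch.distributions.Beta(alpha, alpha).sample((n,)).to(x.device)
    lam = lam * prior[y[idx]] / (lam * prior[y[idx]] + (1.0 - lam) * prior[y])
    # mix inputs; lam broadcast over the non-batch dimensions
    lam_x = lam.view(-1, *([1] * (x.dim() - 1)))
    x_mix = lam_x * x + (1.0 - lam_x) * x[idx]
    return x_mix, y, y[idx], lam


def bayias_ce(logits, y_a, y_b, lam, prior):
    """Cross-entropy with a prior-compensation term on the logits.

    Adding log(prior) during training absorbs the bias induced by the
    mismatch between the long-tailed training prior and the (uniform)
    test prior, in the spirit of the paper's Bayias compensation; the
    two mixed labels are combined as in standard mixup training.
    """
    z = logits + prior.log()  # compensate the training prior
    loss_a = F.cross_entropy(z, y_a, reduction="none")
    loss_b = F.cross_entropy(z, y_b, reduction="none")
    return (lam * loss_a + (1.0 - lam) * loss_b).mean()
```

In a training loop one would call `x_mix, y_a, y_b, lam = unimix_batch(x, y, prior)` and then `loss = bayias_ce(model(x_mix), y_a, y_b, lam, prior)`; at test time the raw, uncompensated logits are used. Note that adding `log(prior)` to the logits during training is equivalent to subtracting it at inference, placing this compensation in the same family as logit-adjustment methods.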
Benchmarks
| Benchmark | Method | Error Rate (%) |
|---|---|---|
| CIFAR-10-LT (imbalance ratio 10) | Prior-LT | 12.20 |
| CIFAR-10-LT (imbalance ratio 10) | UniMix+Bayias | 10.34 |
| CIFAR-100-LT (imbalance ratio 10) | UniMix+Bayias (ResNet-32) | 38.75 |
| CIFAR-100-LT (imbalance ratio 100) | Prior-LT | 53.59 |
| CIFAR-100-LT (imbalance ratio 100) | UniMix+Bayias (ResNet-32) | 54.55 |