Improving Calibration for Long-Tailed Recognition
Zhisheng Zhong, Jiequan Cui, Shu Liu, Jiaya Jia

Abstract
Deep neural networks may perform poorly when training datasets are heavily class-imbalanced. Recent two-stage methods decouple representation learning from classifier learning to improve performance, but the vital issue of miscalibration remains. To address it, we design two methods that improve both calibration and performance in such scenarios. Motivated by the fact that the predicted probability distributions of classes are highly related to the numbers of class instances, we propose label-aware smoothing to deal with the different degrees of over-confidence across classes and to improve classifier learning. Because the two stages use different samplers and thus introduce dataset bias, we further propose shifted batch normalization in the decoupling framework. Our proposed methods set new records on multiple popular long-tailed recognition benchmark datasets, including CIFAR-10-LT, CIFAR-100-LT, ImageNet-LT, Places-LT, and iNaturalist 2018. Code will be available at https://github.com/Jia-Research-Lab/MiSLAS.
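The two components can be illustrated in isolation. Below is a minimal PyTorch sketch of label-aware smoothing, assuming a linear schedule that assigns a larger smoothing factor to head classes (which tend to be more over-confident) and a smaller one to tail classes; the paper explores several such schedules, and the function name and the `eps_head`/`eps_tail` hyper-parameters here are illustrative, not the repository's exact API:

```python
import torch
import torch.nn.functional as F

def label_aware_smoothing_loss(logits, targets, class_counts,
                               eps_head=0.3, eps_tail=0.0):
    """Cross-entropy with a per-class smoothing factor tied to class frequency.

    class_counts: length-K tensor of training-instance counts per class.
    """
    counts = class_counts.float()
    # Linear schedule: head classes (largest counts) get eps_head,
    # tail classes (smallest counts) get eps_tail.
    eps = eps_tail + (eps_head - eps_tail) * \
        (counts - counts.min()) / (counts.max() - counts.min())

    num_classes = logits.size(1)
    eps_y = eps[targets]                       # smoothing factor per sample
    # Smoothed target: 1 - eps_y on the true class, eps_y / (K - 1) elsewhere.
    soft = (eps_y / (num_classes - 1)).unsqueeze(1).repeat(1, num_classes)
    soft.scatter_(1, targets.unsqueeze(1), (1.0 - eps_y).unsqueeze(1))

    return -(soft * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```

Shifted batch normalization targets the statistics mismatch between the instance-balanced sampler of stage one and the class-balanced sampler of stage two. A hedged sketch of the idea, relying on the standard PyTorch behaviour that a BN layer in train mode re-estimates its running mean and variance during forward passes even when no gradients flow (the paper's actual stage-two procedure may differ in detail):

```python
import torch.nn as nn

def prepare_backbone_for_stage2(backbone: nn.Module) -> nn.Module:
    """Freeze all learnable weights; let only BN running statistics shift
    toward the class-balanced sampler's data distribution."""
    backbone.eval()                            # freeze behaviour of other layers
    for p in backbone.parameters():
        p.requires_grad = False                # no gradient updates anywhere
    for m in backbone.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d)):
            m.train()                          # train mode: running mean/var update
    return backbone
```

With this setup, simply forwarding class-balanced batches through the frozen backbone lets the BN means and variances drift to the new sampling distribution while the convolutional weights and the BN affine parameters stay fixed.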
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| long-tail-learning-on-cifar-10-lt-r-10 | MiSLAS | Error Rate: 10.0% |
| long-tail-learning-on-cifar-10-lt-r-100 | MiSLAS | Error Rate: 17.9% |
| long-tail-learning-on-cifar-100-lt-r-10 | MiSLAS | Error Rate: 36.8% |
| long-tail-learning-on-cifar-100-lt-r-50 | MiSLAS | Error Rate: 47.7% |
| long-tail-learning-on-cifar-100-lt-r-100 | MiSLAS | Error Rate: 53.0% |
| long-tail-learning-on-imagenet-lt | MiSLAS | Top-1 Accuracy: 52.7% |
| long-tail-learning-on-inaturalist-2018 | MiSLAS | Top-1 Accuracy: 71.6% |