Long-Tailed Classification with Gradual Balanced Loss and Adaptive Feature Generation
Zihan Zhang, Xiang Xiang

Abstract
Real-world data distributions are essentially long-tailed, which poses a great challenge to deep models. In this work, we propose a new method, Gradual Balanced Loss and Adaptive Feature Generator (GLAG), to alleviate imbalance. GLAG first learns a balanced and robust feature model with Gradual Balanced Loss, then freezes the feature model and augments the under-represented tail classes at the feature level using knowledge from the well-represented head classes. The generated samples are mixed with real training samples during subsequent training epochs. Gradual Balanced Loss is a general loss and can be combined with different decoupled training methods to improve their original performance. State-of-the-art results have been achieved on long-tailed datasets such as CIFAR100-LT, ImageNet-LT, and iNaturalist, which demonstrates the effectiveness of GLAG for long-tailed visual recognition.
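The abstract does not give the exact form of Gradual Balanced Loss or the feature generator, so the following is only a minimal illustrative sketch of the two ideas it describes: a class-balanced loss whose balancing strength ramps up over training, and feature-level augmentation of a tail class that borrows intra-class spread from a head class. All names, the logit-adjustment formulation, and the linear schedule are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): gradually balanced loss +
# feature-level tail-class augmentation.
import torch
import torch.nn.functional as F


class GradualBalancedLoss(torch.nn.Module):
    """Cross-entropy with a class-prior logit offset whose strength grows over epochs."""

    def __init__(self, class_counts, total_epochs):
        super().__init__()
        priors = torch.tensor(class_counts, dtype=torch.float)
        priors = priors / priors.sum()
        # Log-prior offset: head classes get a larger additive term, so the model
        # must produce larger margins on tail classes to keep the loss low.
        self.register_buffer("log_prior", torch.log(priors))
        self.total_epochs = total_epochs

    def forward(self, logits, targets, epoch):
        # Ramp the balancing strength from 0 (plain cross-entropy) to 1 (fully adjusted).
        alpha = min(1.0, epoch / self.total_epochs)
        adjusted_logits = logits + alpha * self.log_prior
        return F.cross_entropy(adjusted_logits, targets)


def augment_tail_features(tail_feats, head_feats, num_new):
    """Generate new tail-class features around the tail mean, reusing a head class's spread."""
    mean = tail_feats.mean(dim=0)
    head_std = head_feats.std(dim=0)  # intra-class variation borrowed from a head class
    noise = torch.randn(num_new, tail_feats.shape[1])
    return mean + noise * head_std
```

In a decoupled pipeline like the one the abstract describes, a loss of this kind would be used while learning the feature model, and the generated tail features would then be mixed with real features when the (frozen-backbone) classifier is trained.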
Benchmarks
| Benchmark | Methodology | Error Rate (%) |
|---|---|---|
| long-tail-learning-on-cifar-100-lt-r-10 | GLAG | 35.5 |
| long-tail-learning-on-cifar-100-lt-r-100 | GLAG | 48.3 |