Long-Tailed Classification with Gradual Balanced Loss and Adaptive Feature Generation

Zihan Zhang, Xiang Xiang


Abstract

Real-world data distributions are essentially long-tailed, which poses a great challenge to deep models. In this work, we propose a new method, Gradual Balanced Loss and Adaptive Feature Generator (GLAG), to alleviate imbalance. GLAG first learns a balanced and robust feature model with Gradual Balanced Loss, then fixes the feature model and augments the under-represented tail classes at the feature level using knowledge from the well-represented head classes. The generated samples are mixed with real training samples during training. Gradual Balanced Loss is a general loss that can be combined with different decoupled training methods to improve their original performance. State-of-the-art results have been achieved on long-tailed datasets such as CIFAR100-LT, ImageNet-LT, and iNaturalist, which demonstrates the effectiveness of GLAG for long-tailed visual recognition.
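
The abstract does not spell out how the "gradual" balancing works, but one common way to realize such a loss is to interpolate per-class weights from uniform (plain cross-entropy) toward a class-balanced weighting as training progresses. The sketch below illustrates that idea only; the schedule, the effective-number reweighting (from class-balanced loss literature), and all function names are assumptions, not the paper's actual formulation.

```python
import math

def class_balanced_weights(counts, beta=0.999):
    """Effective-number reweighting: w_c proportional to (1 - beta) / (1 - beta^n_c).

    `counts` is the number of training samples per class. Weights are
    normalized to have mean 1 so the overall loss scale is preserved.
    """
    effective = [(1.0 - beta ** n) / (1.0 - beta) for n in counts]
    raw = [1.0 / e for e in effective]
    total = sum(raw)
    return [len(counts) * w / total for w in raw]

def gradual_weights(counts, epoch, total_epochs, beta=0.999):
    """Hypothetical gradual schedule: ramp from uniform weights (standard CE)
    at epoch 0 to fully class-balanced weights at the final epoch."""
    alpha = min(1.0, epoch / max(1, total_epochs - 1))
    cb = class_balanced_weights(counts, beta)
    # Convex combination of uniform (1.0) and class-balanced weights.
    return [(1.0 - alpha) * 1.0 + alpha * w for w in cb]
```

For a long-tailed count vector such as `[1000, 100, 10]`, the weights start uniform and gradually tilt toward the tail class, which matches the intuition of first learning general features and only later emphasizing balance.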

Benchmarks

Benchmark | Methodology | Metrics
long-tail-learning-on-cifar-100-lt-r-10 | GLAG | Error Rate: 35.5
long-tail-learning-on-cifar-100-lt-r-100 | GLAG | Error Rate: 48.3
