Disentangling Label Distribution for Long-tailed Visual Recognition

Youngkyu Hong Seungju Han Kwanghee Choi Seokjun Seo Beomsu Kim Buru Chang

Abstract

The current evaluation protocol of long-tailed visual recognition trains the classification model on a long-tailed source label distribution and evaluates its performance on a uniform target label distribution. Such a protocol is of questionable practicality, since the target may also be long-tailed. We therefore formulate long-tailed visual recognition as a label shift problem, in which the target and source label distributions differ. One of the significant hurdles in dealing with the label shift problem is the entanglement between the source label distribution and the model prediction. In this paper, we focus on disentangling the source label distribution from the model prediction. We first introduce a simple but overlooked baseline that matches the target label distribution by post-processing the predictions of a model trained with the cross-entropy loss and the Softmax function. Although this baseline surpasses state-of-the-art methods on benchmark datasets, it can be further improved by directly disentangling the source label distribution from the model prediction during training. We thus propose a novel method, the LAbel distribution DisEntangling (LADE) loss, based on the optimal bound of the Donsker-Varadhan representation. LADE achieves state-of-the-art performance on benchmark datasets such as CIFAR-100-LT, Places-LT, ImageNet-LT, and iNaturalist 2018. Moreover, LADE outperforms existing methods on various shifted target label distributions, showing the general adaptability of our proposed method.
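The post-processing baseline described in the abstract can be sketched as follows. The idea: a classifier trained with cross-entropy on the long-tailed source implicitly bakes the source label prior into its predictions, so at test time the logits can be shifted by the log-ratio of the target prior to the source prior before applying Softmax. This is a minimal NumPy sketch of that adjustment; the function name `pc_softmax` and the exact formulation are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def pc_softmax(logits, source_prior, target_prior):
    """Post-compensation baseline (sketch): shift logits by the
    log-ratio of the target to the source label prior, then
    renormalize with a numerically stable softmax."""
    adjusted = logits + np.log(target_prior) - np.log(source_prior)
    adjusted -= adjusted.max(axis=-1, keepdims=True)  # stability shift
    exp = np.exp(adjusted)
    return exp / exp.sum(axis=-1, keepdims=True)

# Illustrative example: a skewed source prior and a uniform target.
logits = np.array([[2.0, 1.0, 0.0]])
source = np.array([0.7, 0.2, 0.1])   # long-tailed source
target = np.array([1/3, 1/3, 1/3])   # uniform target
probs = pc_softmax(logits, source, target)
```

With a uniform target and a long-tailed source, the adjustment down-weights head classes and up-weights tail classes relative to a plain Softmax, matching the target label distribution without retraining the model.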

Code Repositories

beierzhu/xerm (PyTorch)
hyperconnect/LADE (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
image-classification-on-inaturalist-2018 | LADE | Top-1 Accuracy: 70.0%
long-tail-learning-on-cifar-10-lt-r-10 | LADE | Error Rate: 11.22%
long-tail-learning-on-cifar-100-lt-r-10 | LADE | Error Rate: 38.3%
long-tail-learning-on-cifar-100-lt-r-100 | LADE | Error Rate: 54.6%
long-tail-learning-on-imagenet-lt | LADE | Top-1 Accuracy: 53.0%
long-tail-learning-on-inaturalist-2018 | LADE | Top-1 Accuracy: 70.0%
long-tail-learning-on-places-lt | LADE | Top-1 Accuracy: 38.8%
