HyperAI
Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels

Zhilu Zhang; Mert R. Sabuncu

Abstract

Deep neural networks (DNNs) have achieved tremendous success in a variety of applications across many disciplines. Yet, their superior performance comes at the expensive cost of requiring correctly annotated large-scale datasets. Moreover, due to DNNs' rich capacity, errors in training labels can hamper performance. To combat this problem, mean absolute error (MAE) has recently been proposed as a noise-robust alternative to the commonly used categorical cross entropy (CCE) loss. However, as we show in this paper, MAE can perform poorly with DNNs and challenging datasets. Here, we present a theoretically grounded set of noise-robust loss functions that can be seen as a generalization of MAE and CCE. The proposed loss functions can be readily applied with any existing DNN architecture and algorithm, while yielding good performance in a wide range of noisy-label scenarios. We report results from experiments conducted with the CIFAR-10, CIFAR-100 and FASHION-MNIST datasets and synthetically generated noisy labels.
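The generalization of MAE and CCE described in the abstract is the paper's L_q loss, L_q(f(x), e_j) = (1 - f_j(x)^q) / q, where f_j(x) is the predicted probability of the true class and q ∈ (0, 1]: as q → 0 it recovers categorical cross entropy, and at q = 1 it is equivalent (up to scale) to MAE. A minimal NumPy sketch of this loss follows; the function name and batching details are illustrative assumptions, and the default q = 0.7 matches the value used in the paper's experiments:

```python
import numpy as np

def generalized_cross_entropy(probs, labels, q=0.7):
    """L_q loss from Zhang & Sabuncu (2018), a sketch.

    probs:  (N, C) array of predicted class probabilities (rows sum to 1)
    labels: (N,) array of integer class labels
    q:      noise-robustness parameter in (0, 1];
            q -> 0 recovers cross entropy, q = 1 behaves like MAE.
    """
    # Probability assigned to the true class of each sample.
    p_true = probs[np.arange(len(labels)), labels]
    # L_q(f(x), e_j) = (1 - f_j(x)^q) / q, averaged over the batch.
    return np.mean((1.0 - p_true ** q) / q)
```

For a single confident prediction p = 0.9 on the true class, q = 1 gives a loss of 0.1 (the MAE-like regime), while a very small q gives approximately -log(0.9) ≈ 0.105 (the CCE regime); intermediate q values trade off CCE's fast convergence against MAE's robustness to label noise.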

Code Repositories

- arghosh/noisy_label_pretrain (PyTorch)
- dmizr/phuber (PyTorch)
- AlanChou/Truncated-Loss (PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
image-classification-on-clothing1m | GCE | Accuracy: 69.75%
learning-with-noisy-labels-on-cifar-100n | GCE | Accuracy (mean): 56.73
learning-with-noisy-labels-on-cifar-10n | GCE | Accuracy (mean): 87.85
learning-with-noisy-labels-on-cifar-10n-1 | GCE | Accuracy (mean): 87.61
learning-with-noisy-labels-on-cifar-10n-2 | GCE | Accuracy (mean): 87.70
learning-with-noisy-labels-on-cifar-10n-3 | GCE | Accuracy (mean): 87.58
learning-with-noisy-labels-on-cifar-10n-worst | GCE | Accuracy (mean): 80.66
