Yikai Zhang, Songzhu Zheng, Pengxiang Wu, Mayank Goswami, Chao Chen

Abstract
Label noise is frequently observed in real-world large-scale datasets. The noise is introduced due to a variety of reasons; it is heterogeneous and feature-dependent. Most existing approaches to handling noisy labels fall into two categories: they either assume an ideal feature-independent noise, or remain heuristic without theoretical guarantees. In this paper, we propose to target a new family of feature-dependent label noise, which is much more general than commonly used i.i.d. label noise and encompasses a broad spectrum of noise patterns. Focusing on this general noise family, we propose a progressive label correction algorithm that iteratively corrects labels and refines the model. We provide theoretical guarantees showing that for a wide variety of (unknown) noise patterns, a classifier trained with this strategy converges to be consistent with the Bayes classifier. In experiments, our method outperforms SOTA baselines and is robust to various noise types and levels.
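To make the idea of "iteratively corrects labels and refines the model" concrete, below is a minimal sketch of a progressive label correction loop in the spirit described by the abstract. The classifier choice, threshold schedule, and function names here are illustrative assumptions, not the authors' reference PLC implementation.

```python
# Hypothetical sketch of a progressive label correction (PLC-style) loop.
# Assumption: labels the model contradicts with high confidence are flipped,
# and the confidence threshold is relaxed over successive rounds.
import numpy as np
from sklearn.linear_model import LogisticRegression


def progressive_label_correction(X, y_noisy, n_rounds=10,
                                 theta_start=0.95, theta_end=0.70):
    """Retrain a classifier each round and correct labels it disagrees
    with at confidence above a (gradually decreasing) threshold."""
    y = y_noisy.copy()
    thresholds = np.linspace(theta_start, theta_end, n_rounds)
    model = None
    for theta in thresholds:
        model = LogisticRegression(max_iter=1000).fit(X, y)
        proba = model.predict_proba(X)      # per-class confidence
        pred = proba.argmax(axis=1)         # model's current prediction
        conf = proba.max(axis=1)
        # Flip only labels that the model both disagrees with and is
        # confident about at the current threshold.
        flip = (pred != y) & (conf >= theta)
        y[flip] = pred[flip]
    return model, y
```

Starting with a high threshold and lowering it over rounds means only the most clear-cut label errors are corrected early, and progressively more ambiguous ones as the model improves, which mirrors the progressive strategy the paper describes.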
Benchmarks
| Benchmark | Methodology | Accuracy | ImageNet Pretrained | Network |
|---|---|---|---|---|
| learning-with-noisy-labels-on-animal | Cross Entropy | 79.4 | No | Vgg19-BN |
| learning-with-noisy-labels-on-animal | PLC | 83.4 | No | Vgg19-BN |