Learning with Noisy Labels on CIFAR-100N

Evaluation Metric

Accuracy (mean)
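Assuming the standard definition used by noisy-label benchmarks (a sketch, not this leaderboard's exact evaluation script): accuracy is the fraction of clean test labels predicted correctly, and the mean is taken over repeated training runs.

```python
import numpy as np

def accuracy(preds, labels):
    """Fraction of test examples whose predicted class matches the
    clean test label (CIFAR-100N corrupts only the training labels)."""
    preds = np.asarray(preds)
    labels = np.asarray(labels)
    return float(np.mean(preds == labels))

def mean_accuracy(runs):
    """Average accuracy over several independent runs, given as
    (predictions, labels) pairs."""
    return float(np.mean([accuracy(p, y) for p, y in runs]))
```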

Evaluation Results

Performance of each model on this benchmark:

| Model | Accuracy (mean) | Paper Title |
|---|---|---|
| PGDF | 74.08 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels |
| ProMix | 73.39 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility |
| PSSCL | 72.00 | PSSCL: A Progressive Sample Selection Framework with Contrastive Loss Designed for Noisy Labels |
| Divide-Mix | 71.13 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning |
| SOP+ | 67.81 | Robust Training under Label Noise by Over-parameterization |
| ELR+ | 66.72 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| ILL | 65.84 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations |
| CAL | 61.73 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels |
| CORES | 61.15 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| Co-Teaching | 60.37 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels |
| JoCoR | 59.97 | Combating Noisy Labels by Agreement: A Joint Training Method with Co-Regularization |
| ELR | 58.94 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| Negative-LS | 58.59 | To Smooth or Not? When Label Smoothing Meets Noisy Labels |
| Co-Teaching+ | 57.88 | How Does Disagreement Help Generalization against Label Corruption? |
| VolMinNet | 57.80 | Provably End-to-end Label-Noise Learning without Anchor Points |
| Peer Loss | 57.59 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates |
| Backward-T | 57.14 | Making Deep Neural Networks Robust to Label Noise: A Loss Correction Approach |
| F-div | 57.10 | When Optimizing $f$-divergence is Robust with Label Noise |
| Forward-T | 57.01 | Making Deep Neural Networks Robust to Label Noise: A Loss Correction Approach |
| GCE | 56.73 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels |
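As one concrete instance of the loss-based methods listed above, the GCE paper's generalized cross entropy interpolates between standard cross-entropy (q → 0) and the noise-robust MAE (q = 1) via an exponent q. A minimal NumPy sketch of the published formula L_q(p, y) = (1 − p_y^q) / q (a hypothetical helper, not the authors' reference implementation):

```python
import numpy as np

def gce_loss(probs, labels, q=0.7):
    """Generalized Cross Entropy (Zhang & Sabuncu).
    probs: (N, C) array of predicted class probabilities
    labels: (N,) array of (possibly noisy) integer labels
    q: robustness exponent; q=0.7 is the paper's common default."""
    # probability assigned to each example's given label
    p_y = probs[np.arange(len(labels)), labels]
    # L_q = (1 - p_y**q) / q, averaged over the batch
    return float(np.mean((1.0 - p_y ** q) / q))
```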
Source: HyperAI超神经 SOTA benchmarks