Learning with Noisy Labels on CIFAR-10N (Worst)

Evaluation Metric

Accuracy (mean)

Evaluation Results

Performance of each model on this benchmark.

| Model | Accuracy (mean) | Paper Title | Repository |
|---|---|---|---|
| ProMix | 96.16 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility | |
| PSSCL | 95.12 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels | - |
| PGDF | 93.65 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | |
| ILL | 93.58 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | |
| SOP+ | 93.24 | Robust Training under Label Noise by Over-parameterization | |
| Divide-Mix | 92.56 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| CORES* | 91.66 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR+ | 91.09 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| GNL | 86.99 | Partial Label Supervision for Agnostic Generative Noisy Label Learning | |
| CAL | 85.36 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| Co-Teaching | 83.83 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| CORES | 83.60 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR | 83.58 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| JoCoR | 83.37 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| Co-Teaching+ | 83.26 | How does Disagreement Help Generalization against Label Corruption? | |
| Negative-LS | 82.99 | Understanding Generalized Label Smoothing when Learning with Noisy Labels | - |
| Positive-LS | 82.76 | Does label smoothing mitigate label noise? | - |
| F-div | 82.53 | When Optimizing $f$-divergence is Robust with Label Noise | |
| Peer Loss | 82.53 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
| GCE | 80.66 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | |
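The reported metric, Accuracy (mean), is the clean-test-set accuracy averaged over independent training runs. A minimal sketch of this scoring, assuming per-run prediction lists (the page does not state the exact number of runs, so the data below is a toy example):

```python
# Minimal sketch of the "Accuracy (mean)" metric: test accuracy
# averaged over independent runs. Not the benchmark's official
# scoring code; values below are toy data for illustration.

def accuracy(preds, labels):
    """Fraction of predictions matching the clean test labels."""
    assert len(preds) == len(labels)
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def mean_accuracy(runs, labels):
    """Average accuracy over runs (e.g. different random seeds)."""
    return sum(accuracy(p, labels) for p in runs) / len(runs)

# Toy example: 4 test samples, 2 runs, each getting 3/4 correct.
labels = [0, 1, 2, 3]
runs = [[0, 1, 2, 0],
        [0, 1, 1, 3]]
print(f"{100 * mean_accuracy(runs, labels):.2f}")  # -> 75.00
```

Note that models are trained on the CIFAR-10N "worst" human-annotated noisy labels, but the metric is always computed against the clean CIFAR-10 test labels.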