Learning With Noisy Labels On Cifar 10N

Evaluation Metric

Accuracy (mean)
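The reported metric is top-1 classification accuracy on the test set, averaged over repeated runs (e.g. different random seeds). A minimal sketch of how this is computed (function names here are illustrative, not taken from any benchmark codebase):

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the ground-truth labels (top-1 accuracy)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))

def mean_accuracy(per_run_accuracies):
    """Mean accuracy across independent training runs."""
    return float(np.mean(per_run_accuracies))

# Example: 3 of 4 predictions correct -> accuracy 0.75
acc = accuracy([0, 1, 2, 1], [0, 2, 2, 1])
```

Leaderboard entries below report this mean accuracy as a percentage.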

Evaluation Results

Performance of each model on this benchmark:

| Model | Accuracy (mean) | Paper Title |
| --- | --- | --- |
| ProMix | 97.39 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility |
| PSSCL | 96.41 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels |
| PGDF | 96.11 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels |
| SOP+ | 95.61 | Robust Training under Label Noise by Over-parameterization |
| ILL | 95.47 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations |
| CORES* | 95.25 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| DivideMix | 95.01 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning |
| ELR+ | 94.83 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| PES (Semi) | 94.66 | Understanding and Improving Early Stopping for Learning with Noisy Labels |
| GNL | 92.57 | Partial Label Supervision for Agnostic Generative Noisy Label Learning |
| ELR | 92.38 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| CAL | 91.97 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels |
| Negative-LS | 91.97 | To Smooth or Not? When Label Smoothing Meets Noisy Labels |
| F-div | 91.64 | When Optimizing $f$-divergence is Robust with Label Noise |
| Positive-LS | 91.57 | Does label smoothing mitigate label noise? |
| JoCoR | 91.44 | Combating noisy labels by agreement: A joint training method with co-regularization |
| CORES | 91.23 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| Co-Teaching | 91.20 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels |
| Peer Loss | 90.75 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates |
| Co-Teaching+ | 90.61 | How does Disagreement Help Generalization against Label Corruption? |