Learning with Noisy Labels on CIFAR-10N (Random 2)

Evaluation Metric

Accuracy (mean)
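The reported number is top-1 classification accuracy on the clean CIFAR-10 test set, averaged over independent training runs. A minimal sketch of how such a score might be computed, assuming a PyTorch model and test loader; the `train_with_noisy_labels` helper and the number of runs are hypothetical, not part of the benchmark:

```python
import torch

def test_accuracy(model, loader, device="cuda"):
    """Top-1 accuracy (%) of `model` on a clean test set."""
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            preds = model(images).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.size(0)
    return 100.0 * correct / total

# Hypothetical usage: train several models on the noisy labels with
# different seeds, then average their clean-test accuracies, as the
# "Accuracy (mean)" metric suggests.
# accs = [test_accuracy(train_with_noisy_labels(seed=s), test_loader)
#         for s in range(3)]
# mean_acc = sum(accs) / len(accs)
```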

Evaluation Results

Performance of each model on this benchmark.

| Model | Accuracy (mean) | Paper | Repository |
|---|---|---|---|
| PSSCL | 96.21 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels | - |
| SOP | 95.31 | Robust Training under Label Noise by Over-parameterization | |
| ILL | 95.04 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | |
| CORES* | 94.88 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR+ | 94.20 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| ELR | 91.61 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| GNL | 91.42 | Partial Label Supervision for Agnostic Generative Noisy Label Learning | |
| Divide-Mix | 90.90 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| CAL | 90.75 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| Negative-LS | 90.37 | To Smooth or Not? When Label Smoothing Meets Noisy Labels | |
| Co-Teaching | 90.30 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| JoCoR | 90.21 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| CORES | 89.91 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| F-div | 89.79 | When Optimizing $f$-divergence is Robust with Label Noise | |
| Co-Teaching+ | 89.47 | How does Disagreement Help Generalization against Label Corruption? | |
| Positive-LS | 89.35 | Does label smoothing mitigate label noise? | - |
| Peer Loss | 88.76 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
| VolMinNet | 88.27 | Provably End-to-end Label-Noise Learning without Anchor Points | |
| T-Revision | 87.71 | Are Anchor Points Really Indispensable in Label-Noise Learning? | |
| GCE | 87.70 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | |
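The methods above train on CIFAR-10 images paired with the "Random 2" set of human-annotated noisy labels from the CIFAR-N release and evaluate on the clean test set. A minimal sketch of swapping the noisy labels into a standard torchvision training set, assuming the `CIFAR-10_human.pt` file and `random_label2` key distributed with CIFAR-N:

```python
import torch
from torchvision.datasets import CIFAR10

# Load the human-annotated noisy labels shipped with CIFAR-N.
# File name and dict key follow the official release; adjust the
# path to wherever the file was downloaded.
noise = torch.load("CIFAR-10_human.pt")
noisy_labels = noise["random_label2"]  # the "Random 2" noise setting

# Replace the clean training labels with the noisy ones.
train_set = CIFAR10(root="./data", train=True, download=True)
assert len(noisy_labels) == len(train_set.targets)
train_set.targets = [int(label) for label in noisy_labels]

# Evaluation always uses the clean CIFAR-10 test set.
test_set = CIFAR10(root="./data", train=False, download=True)
```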