Learning with Noisy Labels on CIFAR-10N-3

Evaluation Metric

Accuracy (mean)
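The metric is standard top-1 classification accuracy on the CIFAR-10 test set, averaged over multiple training runs. A minimal sketch of how such a score could be computed (the function and variable names are illustrative, not part of the benchmark's tooling):

```python
import numpy as np

def top1_accuracy(predictions, labels):
    """Fraction of samples whose predicted class matches the ground-truth label."""
    return float(np.mean(np.asarray(predictions) == np.asarray(labels)))

# Mean accuracy over several training runs (hypothetical per-run predictions).
run_accuracies = [
    top1_accuracy([0, 1, 2, 3], [0, 1, 2, 3]),  # 1.00
    top1_accuracy([0, 1, 2, 3], [0, 1, 2, 0]),  # 0.75
]
mean_accuracy = float(np.mean(run_accuracies))  # 0.875
```

Reported leaderboard numbers below are this mean accuracy expressed as a percentage.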

Evaluation Results

Results of each model on this benchmark

| Model | Accuracy (mean) | Paper Title | Repository |
|---|---|---|---|
| PSSCL | 96.49 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels | - |
| SOP+ | 95.39 | Robust Training under Label Noise by Over-parameterization | |
| ILL | 95.13 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | |
| CORES* | 94.74 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR+ | 94.34 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| GNL | 91.83 | Partial Label Supervision for Agnostic Generative Noisy Label Learning | |
| ELR | 91.41 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| CAL | 90.74 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| Co-Teaching | 90.15 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| Negative-LS | 90.13 | To Smooth or Not? When Label Smoothing Meets Noisy Labels | |
| JoCoR | 90.11 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| Divide-Mix | 89.97 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| Positive-LS | 89.82 | Does label smoothing mitigate label noise? | - |
| CORES | 89.79 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| F-div | 89.55 | When Optimizing $f$-divergence is Robust with Label Noise | |
| Co-Teaching+ | 89.54 | How does Disagreement Help Generalization against Label Corruption? | |
| Peer Loss | 88.57 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
| VolMinNet | 88.19 | Provably End-to-end Label-Noise Learning without Anchor Points | |
| T-Revision | 87.79 | Are Anchor Points Really Indispensable in Label-Noise Learning? | |
| GCE | 87.58 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | |