Learning with Noisy Labels on CIFAR-10N-1

Evaluation Metrics

Accuracy (mean)
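The metric is mean top-1 accuracy, i.e. test accuracy averaged over repeated training runs (for example, different random seeds). A minimal sketch of that aggregation, with hypothetical per-run numbers:

```python
import statistics

def mean_accuracy(run_accuracies):
    """Average top-1 test accuracy (%) over repeated runs of the same model."""
    return statistics.mean(run_accuracies)

# Hypothetical accuracies from three runs with different seeds.
runs = [96.9, 97.0, 97.01]
print(round(mean_accuracy(runs), 2))
```

The leaderboard reports this single averaged number per method.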

Evaluation Results

Results of each model on this benchmark.

| Model | Accuracy (mean) | Paper | Repository |
| --- | --- | --- | --- |
| ProMix | 96.97 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility | |
| PSSCL | 96.17 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels | - |
| PGDF | 96.01 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | |
| SOP+ | 95.28 | Robust Training under Label Noise by Over-parameterization | |
| ILL | 94.85 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | |
| CORES* | 94.45 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR+ | 94.43 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| GNL | 91.97 | Partial Label Supervision for Agnostic Generative Noisy Label Learning | |
| ELR | 91.46 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| CAL | 90.93 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| Co-Teaching | 90.33 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| JoCoR | 90.30 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| Negative-LS | 90.29 | To Smooth or Not? When Label Smoothing Meets Noisy Labels | |
| DivideMix | 90.18 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| Positive-LS | 89.80 | Does label smoothing mitigate label noise? | - |
| F-div | 89.70 | When Optimizing $f$-divergence is Robust with Label Noise | |
| Co-Teaching+ | 89.70 | How does Disagreement Help Generalization against Label Corruption? | |
| CORES | 89.66 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| Peer Loss | 89.06 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
| T-Revision | 88.33 | Are Anchor Points Really Indispensable in Label-Noise Learning? | |
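CIFAR-10N-1 replaces the clean CIFAR-10 training labels with one set of real human annotations ("Random 1"), so each method above is trained on noisy labels and evaluated on the clean test set. A minimal sketch of how one could measure the label noise rate, using synthetic stand-in arrays (the real annotation files from the CIFAR-10N release are not loaded here):

```python
import numpy as np

def noise_rate(clean_labels, noisy_labels):
    """Fraction of training labels that disagree with the clean labels."""
    clean = np.asarray(clean_labels)
    noisy = np.asarray(noisy_labels)
    return float(np.mean(clean != noisy))

# Synthetic stand-in: 50,000 labels with exactly 10% flipped to a wrong class.
rng = np.random.default_rng(0)
clean = rng.integers(0, 10, size=50_000)
noisy = clean.copy()
flip = rng.choice(50_000, size=5_000, replace=False)
noisy[flip] = (noisy[flip] + rng.integers(1, 10, size=5_000)) % 10
print(noise_rate(clean, noisy))  # → 0.1
```

A higher noise rate makes the benchmark harder; the methods above differ mainly in how they detect or down-weight the mislabeled samples.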