Liron Bergman, Niv Cohen, Yedid Hoshen

Abstract
Nearest neighbors is a successful and long-standing technique for anomaly detection. Significant progress has recently been achieved by self-supervised deep methods (e.g., RotNet). However, self-supervised features typically under-perform ImageNet-pretrained features. In this work, we investigate whether these recent methods can indeed outperform simple nearest-neighbor methods operating on an ImageNet-pretrained feature space. The simple nearest-neighbor-based approach is experimentally shown to outperform self-supervised methods in accuracy, few-shot generalization, training time, and noise robustness, while making fewer assumptions about image distributions.
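The scoring step of the nearest-neighbor approach can be sketched as follows: each test sample is scored by its mean distance to its k nearest neighbors in the training feature set. This is a minimal sketch using random NumPy vectors as stand-ins; in the paper, the features come from an ImageNet-pretrained network, and the value k=2 here is an illustrative choice.

```python
import numpy as np

def knn_anomaly_score(train_feats, test_feats, k=2):
    """Score each test feature by its mean distance to the k nearest
    training features; larger scores indicate more anomalous samples."""
    # pairwise Euclidean distances, shape (n_test, n_train)
    d = np.linalg.norm(test_feats[:, None, :] - train_feats[None, :, :], axis=-1)
    # mean of the k smallest distances per test sample
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

# toy example: normal features cluster near the origin, the anomaly is far away
rng = np.random.default_rng(0)
train = rng.normal(0.0, 0.1, size=(100, 8))     # "normal" training features
test = np.vstack([
    rng.normal(0.0, 0.1, size=(1, 8)),          # in-distribution sample
    np.full((1, 8), 3.0),                       # out-of-distribution sample
])
scores = knn_anomaly_score(train, test, k=2)
```

With real features, ranking test images by this score and computing ROC-AUC against ground-truth anomaly labels gives the numbers reported in the benchmarks below.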
Benchmarks
| Benchmark | Method | Network | ROC-AUC |
|---|---|---|---|
| anomaly-detection-on-anomaly-detection-on-1 | DN2 CLIP ViT | ViT | 93.8 |
| anomaly-detection-on-anomaly-detection-on-2 | DN2 CLIP ViT | ViT | 93.2 |
| anomaly-detection-on-one-class-cifar-10 | DN2 | — | 92.5 |