PCNN: Probable-Class Nearest-Neighbor Explanations Improve Fine-Grained Image Classification Accuracy for AIs and Humans
Giang Nguyen, Valerie Chen, Mohammad Reza Taesiri, Anh Totti Nguyen

Abstract
Nearest neighbors (NN) are traditionally used to compute final decisions, e.g., in Support Vector Machines or k-NN classifiers, and to provide users with explanations for a model's decision. In this paper, we show a novel utility of nearest neighbors: improving the predictions of a frozen, pretrained image classifier C. We leverage an image comparator S that (1) compares the input image with NN images from the top-K most probable classes given by C; and (2) uses the resulting scores to weight the confidence scores of C and refine its predictions. Our method consistently improves fine-grained image classification accuracy on CUB-200, Cars-196, and Dogs-120. Also, a human study finds that showing users our probable-class nearest neighbors (PCNN) reduces over-reliance on AI, thus improving their decision accuracy over prior work, which shows only examples from the most-probable (top-1) class.
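The re-ranking idea in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it assumes a `classifier` that returns softmax probabilities, a hypothetical `nn_lookup(class_idx, x)` helper that retrieves a nearest-neighbor training image of a given class, and a `comparator` S that returns a similarity score in [0, 1].

```python
import torch

def pcnn_rerank(x, classifier, comparator, nn_lookup, top_k=10):
    """Sketch of PCNN re-ranking: weight the classifier's top-K class
    confidences by comparator scores against each class's nearest neighbor.

    Assumptions (not from the paper's released code):
      - classifier(x) returns softmax probabilities of shape (1, num_classes).
      - nn_lookup(class_idx, x) returns a nearest-neighbor training image of
        that class as a tensor (hypothetical helper).
      - comparator(x, nn_img) returns a scalar similarity score in [0, 1].
    """
    probs = classifier(x)                      # (1, num_classes)
    conf, classes = probs.topk(top_k, dim=1)   # top-K most probable classes from C

    weighted = []
    for c, p in zip(classes[0], conf[0]):
        nn_img = nn_lookup(int(c), x)          # NN image from class c's training set
        s = comparator(x, nn_img)              # similarity score from S
        weighted.append(p * s)                 # re-weight C's confidence with S

    weighted = torch.stack(weighted)
    return int(classes[0][weighted.argmax()])  # refined prediction
```

The refined label is simply the top-K class whose re-weighted confidence is highest; classes outside the top-K keep their original (discarded) ranking.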
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| fine-grained-image-classification-on-cub-200 | ResNet-50 | Accuracy: 88.59% |
| fine-grained-image-classification-on-cub-200-1 | ResNet-50 | Accuracy: 88.59% |
| fine-grained-image-classification-on-stanford | ResNet-50 | Accuracy: 91.06% |
| fine-grained-image-classification-on-stanford-1 | ResNet-50 | Accuracy: 86.31% |