Bolun Cai Pengfei Xiong Shangxuan Tian

Abstract
Contrastive learning is a widely studied topic in metric learning. However, sampling effective contrastive pairs remains challenging due to factors such as limited batch size, imbalanced data distribution, and the risk of overfitting. In this paper, we propose a novel metric learning function called Center Contrastive Loss, which maintains a class-wise center bank and compares the category centers against query data points using a contrastive loss. The center bank is updated in real time, which accelerates model convergence without the need for well-designed sample mining. The category centers serve as well-optimized classification proxies that re-balance the supervisory signal of each class. Furthermore, the proposed loss combines the advantages of both contrastive and classification methods: it reduces intra-class variation and enhances inter-class differences, improving the discriminative power of the embeddings. Our experimental results, as shown in Figure 1, demonstrate that a standard network (ResNet50) trained with our loss achieves state-of-the-art performance and faster convergence.
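To make the idea concrete, below is a minimal PyTorch sketch of a center contrastive loss along the lines the abstract describes: queries are contrasted against a bank of per-class centers rather than against in-batch pairs, and the bank is updated in real time. The EMA-style center update, the temperature-scaled cross-entropy form of the contrastive objective, and hyperparameters such as `temperature` and `momentum` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

class CenterContrastiveLoss(torch.nn.Module):
    """Sketch: contrast each query embedding against a class-wise center bank.

    The true-class center acts as the positive; all other centers act as
    negatives, so no pair mining over the batch is needed.
    """

    def __init__(self, num_classes: int, dim: int,
                 temperature: float = 0.1, momentum: float = 0.9):
        super().__init__()
        self.temperature = temperature
        self.momentum = momentum
        # Class-wise center bank, kept outside the autograd graph.
        self.register_buffer(
            "centers", F.normalize(torch.randn(num_classes, dim), dim=1))

    @torch.no_grad()
    def _update_centers(self, embeddings, labels):
        # Real-time EMA update of the centers for classes seen in the batch.
        for c in labels.unique():
            batch_mean = embeddings[labels == c].mean(dim=0)
            self.centers[c] = F.normalize(
                self.momentum * self.centers[c]
                + (1 - self.momentum) * batch_mean, dim=0)

    def forward(self, embeddings, labels):
        embeddings = F.normalize(embeddings, dim=1)
        # Cosine similarity between each query and every class center.
        logits = embeddings @ self.centers.t() / self.temperature
        # Softmax contrastive objective over the center bank.
        loss = F.cross_entropy(logits, labels)
        self._update_centers(embeddings.detach(), labels)
        return loss

# Usage sketch: loss_fn = CenterContrastiveLoss(num_classes=100, dim=512)
#               loss = loss_fn(model(images), labels)
```

Contrasting against centers rather than sampled pairs means every query receives a supervisory signal from all classes in each step, which is consistent with the claimed faster convergence and re-balanced per-class supervision.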
Benchmarks
| Benchmark | Method | R@1 (%) |
|---|---|---|
| Cars196 | CCL (ResNet-50) | 91.02 |
| CUB-200-2011 | CCL (ResNet-50) | 73.45 |
| In-Shop Clothes Retrieval | CCL (ResNet-50) | 92.31 |
| Stanford Online Products | CCL (ResNet-50) | 83.10 |