Center Contrastive Loss for Metric Learning

Bolun Cai Pengfei Xiong Shangxuan Tian

Abstract

Contrastive learning is a widely studied topic in metric learning. However, sampling effective contrastive pairs remains a challenge due to factors such as limited batch size, imbalanced data distribution, and the risk of overfitting. In this paper, we propose a novel metric learning function called Center Contrastive Loss, which maintains a class-wise center bank and compares the category centers with the query data points using a contrastive loss. The center bank is updated in real time to boost model convergence without the need for well-designed sample mining. The category centers serve as well-optimized classification proxies that re-balance the supervisory signal of each class. Furthermore, the proposed loss combines the advantages of both contrastive and classification methods, reducing intra-class variation and enhancing inter-class separation to improve the discriminative power of the embeddings. Our experimental results, as shown in Figure 1, demonstrate that a standard network (ResNet50) trained with our loss achieves state-of-the-art performance and faster convergence.
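The mechanism described above can be sketched in code. The following is a minimal, illustrative NumPy implementation assuming cosine similarity, a softmax contrastive loss against the class centers, and an exponential-moving-average (EMA) center update; the temperature and momentum values are assumptions for illustration, not the paper's reported hyperparameters.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project vectors onto the unit hypersphere."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

class CenterContrastiveLoss:
    """Sketch of a center contrastive loss: a class-wise center bank is
    compared with query embeddings via a softmax contrastive loss, and the
    centers are updated in real time by EMA (hypothetical update rule)."""

    def __init__(self, num_classes, dim, temperature=0.1, momentum=0.9, seed=0):
        rng = np.random.default_rng(seed)
        # Center bank: one unit-norm prototype per class.
        self.centers = l2_normalize(rng.normal(size=(num_classes, dim)))
        self.t = temperature   # softmax temperature (assumed value)
        self.m = momentum      # EMA momentum (assumed value)

    def __call__(self, embeddings, labels):
        z = l2_normalize(embeddings)              # (B, D) query embeddings
        logits = z @ self.centers.T / self.t      # (B, C) scaled cosine sims
        # Numerically stable softmax cross-entropy: each query is pulled
        # toward its own class center and pushed from all other centers.
        logits -= logits.max(axis=1, keepdims=True)
        log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        loss = -log_probs[np.arange(len(labels)), labels].mean()
        # Real-time center update: EMA toward the per-class batch mean.
        for c in np.unique(labels):
            batch_mean = z[labels == c].mean(axis=0)
            self.centers[c] = l2_normalize(
                self.m * self.centers[c] + (1.0 - self.m) * batch_mean
            )
        return loss
```

Because every class contributes exactly one center to the denominator, each query sees a supervisory signal from all categories regardless of the batch composition, which is how the center bank re-balances classes without sample mining.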

Benchmarks

Benchmark                  Methodology      Metric
Cars196                    CCL (ResNet-50)  R@1: 91.02
CUB-200-2011               CCL (ResNet-50)  R@1: 73.45
In-Shop                    CCL (ResNet-50)  R@1: 92.31
Stanford Online Products   CCL (ResNet-50)  R@1: 83.10
