Sampling Matters in Deep Embedding Learning

Chao-Yuan Wu; R. Manmatha; Alexander J. Smola; Philipp Krähenbühl

Abstract

Deep embeddings answer one simple question: How similar are two images? Learning these embeddings is the bedrock of verification, zero-shot learning, and visual search. The most prominent approaches optimize a deep convolutional network with a suitable loss function, such as contrastive loss or triplet loss. While a rich line of work focuses solely on the loss functions, we show in this paper that selecting training examples plays an equally important role. We propose distance weighted sampling, which selects more informative and stable examples than traditional approaches. In addition, we show that a simple margin-based loss is sufficient to outperform all other loss functions. We evaluate our approach on the Stanford Online Products, CAR196, and the CUB200-2011 datasets for image retrieval and clustering, and on the LFW dataset for face verification. Our method achieves state-of-the-art performance on all of them.
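
To make the two ideas concrete, below is a minimal PyTorch sketch of distance weighted sampling and the margin-based loss. The density formula q(d) ∝ d^(n−2) (1 − d²/4)^((n−3)/2) for pairwise distances of points uniform on the unit sphere, and the loss form (α + y(D − β))₊, follow the paper; the hyperparameter values (alpha, beta, the distance cutoff) and the toy batch construction are illustrative assumptions rather than the authors' released implementation, and in the paper β is a learned parameter rather than a fixed constant.

```python
import torch
import torch.nn.functional as F

def distance_weighted_sampling(embeddings, labels, cutoff=0.5):
    """Draw one negative per anchor with probability proportional to 1/q(d),
    where q(d) ~ d^(n-2) * (1 - d^2/4)^((n-3)/2) is the density of pairwise
    distances between points uniform on the unit sphere.
    Assumes embeddings are L2-normalized."""
    n = embeddings.size(1)                          # embedding dimension
    dist = torch.cdist(embeddings, embeddings)      # pairwise Euclidean distances
    dist = dist.clamp(min=cutoff, max=2.0 - 1e-3)   # clip to control variance / stay finite
    # log of the inverse density; normalization constants cancel in the softmax
    log_w = (-(n - 2) * torch.log(dist)
             - ((n - 3) / 2.0) * torch.log(1.0 - 0.25 * dist.pow(2)))
    # only sample true negatives: mask out same-class pairs (incl. the anchor itself)
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    log_w = log_w.masked_fill(same, float('-inf'))
    probs = torch.softmax(log_w, dim=1)             # per-anchor sampling distribution
    return torch.multinomial(probs, 1).squeeze(1)   # one negative index per anchor

def margin_loss(d_ap, d_an, alpha=0.2, beta=1.2):
    """Margin-based loss (alpha + y * (D - beta))_+ with y = +1 for the
    anchor-positive pair and y = -1 for the anchor-negative pair."""
    pos = F.relu(d_ap - beta + alpha)   # pull positives closer than beta - alpha
    neg = F.relu(beta - d_an + alpha)   # push negatives beyond beta + alpha
    return (pos + neg).mean()

# Toy usage: 8 embeddings, 4 classes, one same-class partner per anchor.
torch.manual_seed(0)
emb = F.normalize(torch.randn(8, 64), dim=1)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
pos_idx = torch.tensor([1, 0, 3, 2, 5, 4, 7, 6])
neg_idx = distance_weighted_sampling(emb, labels)
d_ap = (emb - emb[pos_idx]).norm(dim=1)
d_an = (emb - emb[neg_idx]).norm(dim=1)
print(margin_loss(d_ap, d_an))
```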

Benchmarks

| Benchmark | Methodology | Metrics |
| --- | --- | --- |
| image-retrieval-on-cars196 | Margin | R@1: 86.9 |
| metric-learning-on-cars196 | ResNet-50 + Margin | R@1: 79.6 |
| metric-learning-on-cub-200-2011 | ResNet-50 + Margin | R@1: 63.6 |
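
Here R@1 denotes Recall@1, the standard retrieval metric on these benchmarks: the fraction of query images whose single nearest neighbor in embedding space (excluding the query itself) belongs to the same class. A minimal sketch of the computation, assuming a tensor of embeddings and integer class labels:

```python
import torch

def recall_at_1(embeddings: torch.Tensor, labels: torch.Tensor) -> float:
    """Fraction of queries whose nearest neighbor shares their label."""
    dist = torch.cdist(embeddings, embeddings)  # pairwise Euclidean distances
    dist.fill_diagonal_(float('inf'))           # a query cannot match itself
    nn_idx = dist.argmin(dim=1)                 # nearest neighbor per query
    return (labels[nn_idx] == labels).float().mean().item()
```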
