Solving Inefficiency of Self-supervised Representation Learning

Guangrun Wang; Keze Wang; Guangcong Wang; Philip H.S. Torr; Liang Lin

Abstract

Self-supervised learning (especially contrastive learning) has attracted great interest due to its huge potential for learning discriminative representations in an unsupervised manner. Despite the acknowledged successes, existing contrastive learning methods suffer from very low learning efficiency, e.g., taking about ten times more training epochs than supervised learning to reach comparable recognition accuracy. In this paper, we reveal two contradictory phenomena in contrastive learning, which we call the under-clustering and over-clustering problems, that are major obstacles to learning efficiency. Under-clustering means that the model cannot efficiently learn to discover the dissimilarity between inter-class samples when the negative sample pairs for contrastive learning are insufficient to differentiate all the actual object classes. Over-clustering implies that the model cannot efficiently learn features from excessive negative sample pairs, which force the model to over-cluster samples of the same actual class into different clusters. To simultaneously overcome these two problems, we propose a novel self-supervised learning framework using a truncated triplet loss. Specifically, we employ a triplet loss that tends to maximize the relative distance between the positive pair and negative pairs, addressing the under-clustering problem; and we construct the negative pair by selecting a negative sample deputy from all negative samples, avoiding the over-clustering problem with a guarantee from a Bernoulli distribution model. We extensively evaluate our framework on several large-scale benchmarks (e.g., ImageNet, SYSU-30k, and COCO). The results demonstrate our model's superiority (e.g., in learning efficiency) over the latest state-of-the-art methods by a clear margin. Code is available at: https://github.com/wanggrun/triplet .
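As a rough illustration of the idea only (not the authors' reference implementation; see the linked repository for that), the sketch below assumes L2-normalized embeddings and shows a margin-based triplet loss where, instead of contrasting an anchor against all negatives, a single negative "deputy" is drawn at random from the anchor's candidate negatives. The function name, the uniform sampling rule, and the margin value are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def truncated_triplet_loss(anchor, positive, negatives, margin=0.3):
    """Illustrative sketch of a triplet loss with a single negative 'deputy'.

    anchor, positive: (B, D) L2-normalized embeddings of two views of the same images.
    negatives:        (B, N, D) L2-normalized embeddings of candidate negatives per anchor.
    The deputy is sampled uniformly here; the paper's exact selection rule may differ.
    """
    B, N, D = negatives.shape

    # Pick one negative deputy per anchor instead of using all N negatives.
    idx = torch.randint(0, N, (B,), device=anchor.device)
    deputy = negatives[torch.arange(B, device=anchor.device), idx]  # (B, D)

    # Cosine distances (embeddings are assumed unit-normalized).
    pos_dist = 1.0 - (anchor * positive).sum(dim=1)
    neg_dist = 1.0 - (anchor * deputy).sum(dim=1)

    # Margin-based triplet objective: keep the positive closer than the deputy.
    return F.relu(pos_dist - neg_dist + margin).mean()

With embeddings produced by an encoder and normalized (e.g., F.normalize(z, dim=1)), this objective touches only one negative per anchor, which is the intuition behind avoiding over-clustering from excessive negative pairs.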

Code Repositories

wanggrun/triplet (official implementation, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
person-re-identification-on-sysu-30k | Triplet (self-supervised) | Rank-1: 14.8
self-supervised-image-classification-on | Triplet (ResNet-50) | Number of Params: 23.56M; Top-1 Accuracy: 75.9%
self-supervised-person-re-identification-on | Triplet | Rank-1: 14.8
