Contrastive Learning and Self-Training for Unsupervised Domain Adaptation in Semantic Segmentation

Robert A. Marsden, Alexander Bartler, Mario Döbler, Bin Yang


Abstract

Deep convolutional neural networks have considerably improved state-of-the-art results for semantic segmentation. Nevertheless, even modern architectures lack the ability to generalize well to a test dataset that originates from a different domain. To avoid the costly annotation of training data for unseen domains, unsupervised domain adaptation (UDA) attempts to provide efficient knowledge transfer from a labeled source domain to an unlabeled target domain. Previous work has mainly focused on minimizing the discrepancy between the two domains by using adversarial training or self-training. While adversarial training may fail to align the correct semantic categories, as it minimizes the discrepancy only between the global distributions, self-training raises the question of how to provide reliable pseudo-labels. To align the correct semantic categories across domains, we propose a contrastive learning approach that adapts category-wise centroids across domains. Furthermore, we extend our method with self-training, where we use a memory-efficient temporal ensemble to generate consistent and reliable pseudo-labels. Although both contrastive learning and self-training (CLST) through temporal ensembling enable knowledge transfer between two domains, it is their combination that leads to a symbiotic structure. We validate our approach on two domain adaptation benchmarks: GTA5 → Cityscapes and SYNTHIA → Cityscapes. Our method achieves results that are better than or comparable to the state of the art. We will make the code publicly available.
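The two components described in the abstract can be illustrated concretely. First, the category-wise centroid alignment: the sketch below (a minimal PyTorch interpretation with hypothetical names, not the authors' released code) averages the features of each class into one centroid per domain, then applies an InfoNCE-style loss that pulls same-class centroids together across domains and pushes different-class centroids apart. On the source side the class map comes from ground-truth labels; on the target side it would come from pseudo-labels.

```python
import torch
import torch.nn.functional as F

def class_centroids(features, labels, num_classes):
    """Mean feature vector per class.

    features: (B, C, H, W) feature map
    labels:   (B, H, W) class map, downsampled to the feature resolution
              (ground truth for source images, pseudo-labels for target images)
    returns:  (num_classes, C); rows stay zero for classes absent from the batch
    """
    b, c, h, w = features.shape
    feats = features.permute(0, 2, 3, 1).reshape(-1, c)   # (B*H*W, C)
    labs = labels.reshape(-1)                             # (B*H*W,)
    centroids = torch.zeros(num_classes, c, device=features.device)
    for k in range(num_classes):
        mask = labs == k
        if mask.any():
            centroids[k] = feats[mask].mean(dim=0)
    return centroids

def centroid_contrastive_loss(src_cent, tgt_cent, temperature=0.1):
    """InfoNCE over centroids: for each source-domain centroid, the target-domain
    centroid of the same class is the positive; all other classes are negatives."""
    # Only contrast classes that actually occurred in both domains this batch.
    valid = (src_cent.abs().sum(1) > 0) & (tgt_cent.abs().sum(1) > 0)
    src = F.normalize(src_cent[valid], dim=1)
    tgt = F.normalize(tgt_cent[valid], dim=1)
    logits = src @ tgt.t() / temperature                  # (K', K') similarity matrix
    targets = torch.arange(src.size(0), device=src.device)
    return F.cross_entropy(logits, targets)               # diagonal entries are positives
```

Second, the temporal ensemble for pseudo-labels. This sketch keeps an exponential moving average of the network's softmax output for each target image, so labels reflect many past predictions rather than a single forward pass; pixels whose ensembled confidence stays low receive the ignore index. The momentum and threshold values are illustrative assumptions, and where the paper emphasizes memory efficiency, a real implementation would store the maps in a compressed form; this sketch keeps full-resolution float maps for simplicity.

```python
ensemble = {}  # image_id -> running class-probability map, shape (K, H, W)

def ensembled_pseudo_label(image_id, probs, momentum=0.9, threshold=0.9):
    """probs: softmax output (K, H, W) for one target image."""
    avg = ensemble.get(image_id)
    avg = probs.clone() if avg is None else momentum * avg + (1 - momentum) * probs
    ensemble[image_id] = avg
    conf, label = avg.max(dim=0)       # per-pixel confidence and argmax class
    label[conf < threshold] = 255      # 255 = ignore index during self-training
    return label
```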

Benchmarks

Benchmark                                Methodology         Metrics
GTA5 → Cityscapes (synthetic-to-real)    CLST                mIoU: 51.6
SYNTHIA → Cityscapes (synthetic-to-real) CLST (ResNet-101)   mIoU (13 classes): 57.8
                                                             mIoU (16 classes): 49.8
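The reported mIoU follows the standard definition: per-class intersection-over-union computed from a confusion matrix, averaged over the evaluated classes (the common SYNTHIA → Cityscapes protocol evaluates 16 classes, with a 13-class subset also reported). A small NumPy sketch of that computation, not tied to this paper's code:

```python
import numpy as np

def mean_iou(conf):
    """conf: (K, K) confusion matrix, rows = ground truth, cols = prediction."""
    tp = np.diag(conf).astype(float)   # correctly predicted pixels per class
    fp = conf.sum(axis=0) - tp         # predicted as class k but labeled otherwise
    fn = conf.sum(axis=1) - tp         # labeled class k but predicted otherwise
    denom = tp + fp + fn
    iou = np.where(denom > 0, tp / np.maximum(denom, 1), np.nan)
    return np.nanmean(iou)             # average over classes that occur
```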
