Prompt-based Distribution Alignment for Unsupervised Domain Adaptation

Shuanghao Bai, Min Zhang, Wanqi Zhou, Siteng Huang, Zhirong Luan, Donglin Wang, Badong Chen

Abstract

Despite the recent, unprecedented success of large pre-trained vision-language models (VLMs) on a wide range of downstream tasks, the real-world unsupervised domain adaptation (UDA) problem remains under-explored. In this paper, we first experimentally demonstrate that unsupervised-trained VLMs can significantly reduce the distribution discrepancy between source and target domains, thereby improving UDA performance. However, a major challenge in directly deploying such models on downstream UDA tasks is prompt engineering: the prompts must align the domain knowledge of the source and target domains, since UDA performance depends heavily on a good domain-invariant representation. We therefore propose a Prompt-based Distribution Alignment (PDA) method that incorporates domain knowledge into prompt learning. Specifically, PDA employs a two-branch prompt-tuning paradigm consisting of a base branch and an alignment branch. The base branch focuses on integrating class-related representations into the prompts, ensuring discrimination among different classes. To further minimize domain discrepancy, the alignment branch constructs feature banks for both the source and target domains and applies image-guided feature tuning (IFT), which makes the input attend to the feature banks and thereby integrates self-enhanced and cross-domain features into the model. In this way, the two branches mutually reinforce each other to enhance the adaptation of VLMs for UDA. Extensive experiments on three benchmarks demonstrate that the proposed PDA achieves state-of-the-art performance. The code is available at https://github.com/BaiShuanghao/Prompt-based-Distribution-Alignment.
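To make the alignment branch concrete, the sketch below illustrates the core idea of image-guided feature tuning described in the abstract: image features attend over a domain feature bank and fuse the attended context back into the representation. This is a minimal, illustrative PyTorch implementation; the class name, single-head attention, residual fusion, and all dimensions are assumptions for exposition, not the paper's exact design.

```python
import torch
import torch.nn as nn


class ImageGuidedFeatureTuning(nn.Module):
    """Illustrative sketch of IFT: input image features attend over a
    source/target feature bank and the attended context is fused back
    via a residual connection. Simplified single-head attention."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, img_feat: torch.Tensor, bank: torch.Tensor) -> torch.Tensor:
        # img_feat: (B, D) batch of image features
        # bank:     (K, D) feature bank, e.g. one prototype per class
        q = self.query(img_feat)                               # (B, D)
        k = self.key(bank)                                     # (K, D)
        v = self.value(bank)                                   # (K, D)
        attn = torch.softmax(q @ k.t() * self.scale, dim=-1)   # (B, K)
        # Residual fusion: original feature plus bank-derived context
        return img_feat + attn @ v


# Toy usage: fuse a batch of features with a hypothetical source-domain bank
ift = ImageGuidedFeatureTuning(dim=512)
feats = torch.randn(4, 512)
source_bank = torch.randn(31, 512)  # e.g. one prototype per Office-31 class
out = ift(feats, source_bank)
print(out.shape)  # torch.Size([4, 512])
```

In the full PDA method this tuned feature would feed the alignment branch alongside the learned prompts of the base branch; here the two banks (source and target) would simply be passed as separate `bank` arguments.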

Code Repositories

baishuanghao/prompt-based-distribution-alignment (official, PyTorch)

Benchmarks

Benchmark                                        Methodology             Accuracy
unsupervised-domain-adaptation-on-office-31      PDA (CLIP, ViT-B/16)    91.2
unsupervised-domain-adaptation-on-office-home    PDA (CLIP, ResNet-50)   75.3
unsupervised-domain-adaptation-on-office-home    PDA (CLIP, ViT-B/16)    85.7
unsupervised-domain-adaptation-on-visda2017      PDA (CLIP, ViT-B/16)    89.7
unsupervised-domain-adaptation-on-visda2017      PDA (CLIP, ResNet-101)  86.4
