Feature Quantization Improves GAN Training

Yang Zhao, Chunyuan Li, Ping Yu, Jianfeng Gao, Changyou Chen

Abstract

The instability of GAN training has been a long-standing problem despite considerable research effort. We identify that the instability stems from the difficulty of performing feature matching with mini-batch statistics, due to the fragile balance between the fixed target distribution and the progressively evolving generated distribution. In this work, we propose Feature Quantization (FQ) for the discriminator, which embeds both true and fake data samples into a shared discrete space. The quantized values of FQ are constructed as an evolving dictionary that is consistent with the feature statistics of recent distribution history. Hence, FQ implicitly enables robust feature matching in a compact space. Our method can be easily plugged into existing GAN models, with little computational overhead in training. We apply FQ to three representative GAN models on nine benchmarks: BigGAN for image generation, StyleGAN for face synthesis, and U-GAT-IT for unsupervised image-to-image translation. Extensive experimental results show that the proposed FQ-GAN improves the FID scores of baseline methods by a large margin on a variety of tasks, achieving new state-of-the-art performance.
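To illustrate the idea, below is a minimal sketch, in PyTorch, of how a feature-quantization layer for the discriminator could look, assuming a VQ-style codebook updated with an exponential moving average so that it tracks the feature statistics of recent mini-batches. This is not the authors' implementation; the class and parameter names (FeatureQuantizer, num_codes, decay, commitment) are illustrative only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureQuantizer(nn.Module):
    """Quantize discriminator features onto an evolving dictionary (sketch)."""

    def __init__(self, dim, num_codes=1024, decay=0.9, commitment=0.25):
        super().__init__()
        self.decay = decay
        self.commitment = commitment
        codebook = torch.randn(num_codes, dim)
        # Buffers: the dictionary and its EMA statistics are not learned by SGD.
        self.register_buffer("codebook", codebook)
        self.register_buffer("cluster_size", torch.zeros(num_codes))
        self.register_buffer("embed_avg", codebook.clone())

    def forward(self, h):
        # h: discriminator features of shape (batch, dim), from real or fake samples.
        # Nearest-neighbour lookup in the shared discrete space.
        dist = torch.cdist(h, self.codebook)          # (batch, num_codes)
        idx = dist.argmin(dim=1)
        h_q = self.codebook[idx]

        if self.training:
            with torch.no_grad():
                # EMA update keeps the dictionary consistent with recent feature statistics.
                one_hot = F.one_hot(idx, self.codebook.size(0)).type_as(h)
                self.cluster_size.mul_(self.decay).add_(one_hot.sum(0), alpha=1 - self.decay)
                self.embed_avg.mul_(self.decay).add_(one_hot.t() @ h, alpha=1 - self.decay)
                n = self.cluster_size.clamp(min=1e-5).unsqueeze(1)
                self.codebook.copy_(self.embed_avg / n)

        # Commitment loss pulls features toward their assigned codes;
        # the straight-through estimator keeps gradients flowing to h.
        loss = self.commitment * F.mse_loss(h, h_q.detach())
        h_q = h + (h_q - h).detach()
        return h_q, loss
```

In such a setup the layer would be inserted between intermediate discriminator blocks, and the returned quantization loss added to the discriminator objective, so that both real and generated features are matched against the same compact dictionary.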

Benchmarks

Benchmark | Methodology | Metrics
conditional-image-generation-on-cifar-10 | FQ-GAN | FID: 5.34, Inception score: 8.50
conditional-image-generation-on-cifar-100 | FQ-GAN | FID: 7.15, Inception score: 9.74
conditional-image-generation-on-imagenet | FQ-GAN | FID: 13.77, Inception score: 54.36
conditional-image-generation-on-imagenet-1 | FQ-GAN | FID: 9.67, Inception score: 25.96
image-generation-on-ffhq-1024-x-1024 | FQ-GAN | FID: 3.19
image-to-image-translation-on-anime-to-selfie | FQ-GAN | Kernel Inception Distance: 10.23
image-to-image-translation-on-selfie-to-anime | FQ-GAN | Kernel Inception Distance: 11.40
