The GAN is dead; long live the GAN! A Modern GAN Baseline

Yiwen Huang, Aaron Gokaslan, Volodymyr Kuleshov, James Tompkin

Abstract

There is a widely-spread claim that GANs are difficult to train, and GAN architectures in the literature are littered with empirical tricks. We provide evidence against this claim and build a modern GAN baseline in a more principled manner. First, we derive a well-behaved regularized relativistic GAN loss that addresses issues of mode dropping and non-convergence that were previously tackled via a bag of ad-hoc tricks. We analyze our loss mathematically and prove that it admits local convergence guarantees, unlike most existing relativistic losses. Second, our new loss allows us to discard all ad-hoc tricks and replace outdated backbones used in common GANs with modern architectures. Using StyleGAN2 as an example, we present a roadmap of simplification and modernization that results in a new minimalist baseline -- R3GAN. Despite being simple, our approach surpasses StyleGAN2 on FFHQ, ImageNet, CIFAR, and Stacked MNIST datasets, and compares favorably against state-of-the-art GANs and diffusion models.
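The loss described above pairs a relativistic discriminator objective with zero-centered gradient penalties on both real and generated data (commonly denoted R1 and R2). The sketch below is a minimal PyTorch illustration of that combination, assuming a discriminator `D` that outputs unnormalized scalar logits and a generator `G`; the function names, the softplus formulation, and the penalty weight `gamma` are illustrative assumptions, not the paper's exact implementation (see the official repository for that).

```python
# Minimal sketch of a regularized relativistic GAN loss (illustrative only).
# Assumes: D(images) -> scalar logits, G(z) -> images. Hyperparameters and
# helper names (discriminator_loss, generator_loss, gamma) are hypothetical.
import torch
import torch.nn.functional as F


def discriminator_loss(D, G, real, z, gamma=10.0):
    """Relativistic pairing loss for D plus R1/R2 zero-centered gradient penalties."""
    real = real.detach().requires_grad_(True)
    fake = G(z).detach().requires_grad_(True)

    logits_real = D(real)
    logits_fake = D(fake)

    # Relativistic loss: D is pushed to rank real samples above generated ones.
    rel_loss = F.softplus(logits_fake - logits_real).mean()

    # R1: gradient penalty on real data.
    grad_real = torch.autograd.grad(logits_real.sum(), real, create_graph=True)[0]
    r1 = grad_real.square().sum(dim=[1, 2, 3]).mean()

    # R2: gradient penalty on generated data.
    grad_fake = torch.autograd.grad(logits_fake.sum(), fake, create_graph=True)[0]
    r2 = grad_fake.square().sum(dim=[1, 2, 3]).mean()

    return rel_loss + (gamma / 2) * (r1 + r2)


def generator_loss(D, G, real, z):
    """Relativistic pairing loss for G (roles of real and generated samples swap)."""
    logits_real = D(real)
    logits_fake = D(G(z))
    return F.softplus(logits_real - logits_fake).mean()
```

In this sketch the two penalties regularize the discriminator symmetrically on the real and generated distributions, which is the ingredient the abstract credits for avoiding mode dropping and non-convergence without further tricks.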

Code Repositories

brownvc/r3gan (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
image-generation-on-cifar-10 | R3GAN | FID: 1.96
image-generation-on-ffhq-256-x-256 | R3GAN | FID: 2.75
image-generation-on-imagenet-32x32 | R3GAN | FID: 1.27
