HyperAI

Train Sparsely, Generate Densely: Memory-efficient Unsupervised Training of High-resolution Temporal GAN

Masaki Saito; Shunta Saito; Masanori Koyama; Sosuke Kobayashi

Abstract

Training a Generative Adversarial Network (GAN) on a video dataset is challenging because of the sheer size of the dataset and the complexity of each observation. In general, the computational cost of training a GAN scales exponentially with the resolution. In this study, we present a novel memory-efficient method for unsupervised learning of high-resolution video datasets whose computational cost scales only linearly with the resolution. We achieve this by designing the generator as a stack of small sub-generators and training the model in a specific way: each sub-generator is trained with its own discriminator, and at training time we introduce between each pair of consecutive sub-generators an auxiliary subsampling layer that reduces the frame rate by a certain ratio. This procedure allows each sub-generator to learn the distribution of the video at a different level of resolution. As a result, only a few GPUs are needed to train a highly complex generator that far outperforms its predecessor in terms of Inception score.
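The training scheme described above can be illustrated with a minimal sketch. All function names here are our own illustrative stand-ins, not the authors' code: a learned sub-generator is replaced by plain nearest-neighbour upsampling, and the auxiliary subsampling layer simply keeps every second frame, so deeper (higher-resolution) levels process fewer frames during training while inference renders every frame densely.

```python
import numpy as np

def sub_generator(frames, scale=2):
    # Stand-in for a learned upsampling sub-generator: nearest-neighbour
    # upsampling of a clip (T, H, W, C) -> (T, scale*H, scale*W, C).
    return frames.repeat(scale, axis=1).repeat(scale, axis=2)

def subsample_frames(frames, ratio=2):
    # Auxiliary subsampling layer, used only at training time:
    # keep every `ratio`-th frame of the clip.
    return frames[::ratio]

def forward_train(z_frames, num_levels=3, ratio=2):
    # "Train sparsely": subsample between consecutive sub-generators,
    # so each level's output (fed to that level's own discriminator)
    # has fewer frames as the resolution grows.
    outputs = []
    x = z_frames
    for _ in range(num_levels):
        x = sub_generator(x)
        outputs.append(x)          # this level's clip, for its discriminator
        x = subsample_frames(x, ratio)
    return outputs

def forward_generate(z_frames, num_levels=3):
    # "Generate densely": at inference, no subsampling is applied,
    # so the full frame rate is rendered at the highest resolution.
    x = z_frames
    for _ in range(num_levels):
        x = sub_generator(x)
    return x

# 16 latent frames at 16x16; three 2x levels reach 128x128.
z = np.random.randn(16, 16, 16, 3)
train_outs = forward_train(z)
print([o.shape for o in train_outs])  # frame count halves as resolution doubles
full = forward_generate(z)
print(full.shape)                     # (16, 128, 128, 3): dense generation
```

In this sketch the training outputs have shapes (16, 32, 32, 3), (8, 64, 64, 3), and (4, 128, 128, 3): each discriminator sees full-length clips only at low resolution and short clips at high resolution, which is what keeps memory manageable.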

Code Repositories

pfnet-research/tgan2
Official
Mentioned in GitHub

Benchmarks

| Benchmark | Methodology | Metrics |
| --- | --- | --- |
| Video Generation on UCF-101 (16 frames) | TGANv2 | Inception Score: 21.45 |
| Video Generation on UCF-101 (16 frames, 128×128) | TGANv2 | Inception Score: 24.34 |
| Video Generation on UCF-101 (16 frames, 128×128) | TGANv2 (2020) | Inception Score: 28.87 |
