Vikram Voleti, Chris Finlay, Adam Oberman, Christopher Pal

Abstract
Recent work has shown that Neural Ordinary Differential Equations (ODEs) can serve as generative models of images from the perspective of Continuous Normalizing Flows (CNFs). Such models offer exact likelihood computation and invertible generation/density estimation. In this work we introduce a Multi-Resolution variant of such models (MRCNF) by characterizing the conditional distribution over the additional information required to generate a fine image that is consistent with a given coarse image. We introduce a transformation between resolutions that leaves the log likelihood unchanged. We show that this approach yields comparable likelihood values on various image datasets, with improved performance at higher resolutions and fewer parameters, using only one GPU. Further, we examine the out-of-distribution properties of (Multi-Resolution) Continuous Normalizing Flows and find that they are similar to those of other likelihood-based generative models.
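The abstract's "transformation between resolutions that leaves the log likelihood unchanged" can be illustrated with a standard volume-preserving choice: an orthonormal 2x2 Haar transform, which splits an image into a coarse half-resolution image plus three detail channels. Because the transform is orthonormal, its Jacobian has |det| = 1, so the change of variables contributes nothing to the log likelihood. This is a minimal sketch of that general idea; the paper's exact parameterization of the coarse image and detail information may differ.

```python
import numpy as np

def haar_split(x):
    """Split a (H, W) image into a coarse (H/2, W/2) image and three
    detail channels using the orthonormal 2x2 Haar transform.
    Orthonormality means |det J| = 1: no log-likelihood change."""
    a = x[0::2, 0::2]  # top-left pixel of each 2x2 block
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    coarse = (a + b + c + d) / 2.0   # scaled average (coarse image)
    dh = (a - b + c - d) / 2.0       # horizontal detail
    dv = (a + b - c - d) / 2.0       # vertical detail
    dd = (a - b - c + d) / 2.0       # diagonal detail
    return coarse, (dh, dv, dd)

def haar_merge(coarse, details):
    """Exact inverse of haar_split: the Haar matrix is symmetric and
    orthonormal, so applying the same coefficients inverts it."""
    dh, dv, dd = details
    a = (coarse + dh + dv + dd) / 2.0
    b = (coarse - dh + dv - dd) / 2.0
    c = (coarse + dh - dv - dd) / 2.0
    d = (coarse - dh - dv + dd) / 2.0
    h, w = coarse.shape
    x = np.empty((2 * h, 2 * w))
    x[0::2, 0::2] = a
    x[0::2, 1::2] = b
    x[1::2, 0::2] = c
    x[1::2, 1::2] = d
    return x
```

The round trip is exact, and the per-block 4x4 transform matrix has log |det| = 0, which is the property the multi-resolution likelihood decomposition relies on.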
Benchmarks
| Benchmark | Methodology | Metric | Value |
|---|---|---|---|
| density-estimation-on-cifar-10 | MRCNF | NLL (bits/dim) | 3.54 |
| image-generation-on-imagenet-32x32 | MRCNF | NLL (bits/dim) | 3.77 |
| image-generation-on-imagenet-64x64 | MRCNF | NLL (bits/dim) | 3.44 |