HyperAI


Invertible Residual Networks

Jens Behrmann, Will Grathwohl, Ricky T. Q. Chen, David Duvenaud, Jörn-Henrik Jacobsen

Abstract

We show that standard ResNet architectures can be made invertible, allowing the same model to be used for classification, density estimation, and generation. Typically, enforcing invertibility requires partitioning dimensions or restricting network architectures. In contrast, our approach only requires adding a simple normalization step during training, already available in standard frameworks. Invertible ResNets define a generative model which can be trained by maximum likelihood on unlabeled data. To compute likelihoods, we introduce a tractable approximation to the Jacobian log-determinant of a residual block. Our empirical evaluation shows that invertible ResNets perform competitively with both state-of-the-art image classifiers and flow-based generative models, something that has not been previously achieved with a single architecture.
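The two ideas in the abstract — constraining each residual block to be a contraction via a normalization step, and inverting the block by iteration — can be illustrated with a minimal NumPy sketch. All names here are illustrative (the paper's actual models apply spectral normalization to convolutional layers during training); this toy uses a two-layer fully connected residual block:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative residual block g(x) = W2 @ relu(W1 @ x).
W1 = rng.normal(size=(8, 8))
W2 = rng.normal(size=(8, 8))

def spectral_norm(W, n_iter=50):
    # Power iteration: estimate the largest singular value of W.
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    return u @ W @ v

# The "simple normalization step": rescale each weight matrix so the
# block g is a contraction, giving Lip(g) <= 0.9 * 0.9 < 1.
c = 0.9
W1 *= c / spectral_norm(W1)
W2 *= c / spectral_norm(W2)

def g(x):
    return W2 @ np.maximum(W1 @ x, 0.0)

def f(x):
    # A residual block: f(x) = x + g(x).
    return x + g(x)

def f_inverse(y, n_iter=100):
    # Banach fixed-point iteration x_{k+1} = y - g(x_k); it converges
    # to the unique preimage precisely because Lip(g) < 1.
    x = y.copy()
    for _ in range(n_iter):
        x = y - g(x)
    return x

x = rng.normal(size=8)
y = f(x)
x_rec = f_inverse(y)
print(np.max(np.abs(x - x_rec)))  # reconstruction error is tiny
```

Note the design point this demonstrates: invertibility comes from a norm constraint alone, with no dimension partitioning or architectural restriction, so the forward pass remains an ordinary ResNet block.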

Code Repositories

- rtqichen/residual-flows (PyTorch)
- eyalbetzalel/residual-flows (PyTorch)
- yperugachidiaz/invertible_densenets (PyTorch)
- RuqiBai/mixture_flow (PyTorch)

Benchmarks

Benchmark: image-generation-on-mnist
Methodology: i-ResNet
Metric: bits/dimension: 1.06
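Bits/dimension is the model's negative log-likelihood in base 2 divided by the data dimensionality; evaluating it for an invertible ResNet requires the Jacobian log-determinant of each residual block. A minimal sketch of the tractable approximation the abstract mentions, using the power series log det(I + J_g) = Σ_{k≥1} (−1)^{k+1} tr(J_g^k)/k with each trace estimated stochastically (Hutchinson-style Rademacher probes); the matrix A below is an illustrative stand-in for a block's Jacobian:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 6

# Stand-in Jacobian of a contractive block: ||A||_2 < 1, so the
# power series for log det(I + A) converges.
A = rng.normal(size=(d, d))
A *= 0.5 / np.linalg.norm(A, 2)

def logdet_series(A, n_terms=30, n_samples=256):
    # log det(I + A) = sum_{k>=1} (-1)^{k+1} tr(A^k) / k,
    # with tr(A^k) estimated by Hutchinson: tr(M) ~ E[v^T M v].
    est = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=d)  # Rademacher probe
        w = v.copy()
        for k in range(1, n_terms + 1):
            w = A @ w  # w = A^k v via repeated matrix-vector products
            est += (-1) ** (k + 1) * (v @ w) / k / n_samples
    return est

exact = np.linalg.slogdet(np.eye(d) + A)[1]
approx = logdet_series(A)
print(exact, approx)
```

The point of the estimator is that it only needs matrix-vector products with the Jacobian (available via automatic differentiation), never the full Jacobian matrix, which makes maximum-likelihood training feasible at ResNet scale.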

