Iterative Neural Autoregressive Distribution Estimator NADE-k
Kyunghyun Cho, Yao Li, Yoshua Bengio, Tapani Raiko

Abstract
Training of the neural autoregressive density estimator (NADE) can be viewed as doing one step of probabilistic inference on missing values in data. We propose a new model that extends this inference scheme to multiple steps, arguing that it is easier to learn to improve a reconstruction in $k$ steps rather than to learn to reconstruct in a single inference step. The proposed model is an unsupervised building block for deep learning that combines the desirable properties of NADE and multi-predictive training: (1) Its test likelihood can be computed analytically, (2) it is easy to generate independent samples from it, and (3) it uses an inference engine that is a superset of variational inference for Boltzmann machines. The proposed NADE-k is competitive with the state-of-the-art in density estimation on the two datasets tested.
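The abstract describes the central idea of NADE-k: rather than filling in missing values in a single pass, the model refines its reconstruction over $k$ inference iterations while the observed values stay clamped. The following is a minimal, hypothetical sketch of that iterative-reconstruction idea, not the paper's actual architecture or trained model; the network shape, initialization, and parameter names (`W`, `b_hid`, `b_vis`, `nade_k_reconstruct`) are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy parameters (random, untrained) used only to
# illustrate the k-step inference loop, not the paper's model.
n_vis, n_hid = 784, 500
W = rng.normal(scale=0.01, size=(n_vis, n_hid))
b_hid = np.zeros(n_hid)
b_vis = np.zeros(n_vis)

def nade_k_reconstruct(x, observed_mask, k=5):
    """k-step iterative reconstruction of missing values.

    x             : binary data vector, shape (n_vis,)
    observed_mask : 1 where x is observed, 0 where it is missing
    k             : number of inference iterations
    """
    # Initialise missing entries to a simple guess (0.5), keep observed ones.
    v = observed_mask * x + (1 - observed_mask) * 0.5
    for _ in range(k):
        h = sigmoid(v @ W + b_hid)        # encode the current reconstruction
        v_hat = sigmoid(h @ W.T + b_vis)  # decode a refined reconstruction
        # Clamp observed pixels to their true values; update only missing ones.
        v = observed_mask * x + (1 - observed_mask) * v_hat
    return v

# Usage example: a random binary "image" with roughly half the pixels hidden.
x = (rng.random(n_vis) < 0.5).astype(float)
mask = (rng.random(n_vis) < 0.5).astype(float)
reconstruction = nade_k_reconstruct(x, mask, k=5)
```

Each pass through the loop corresponds to one inference step; the paper's argument is that learning to improve a reconstruction over $k$ such steps is easier than learning to reconstruct in one.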
Benchmarks
| Benchmark | Method | Test NLL (nats) |
|---|---|---|
| Image Generation on Binarized MNIST | EoNADE 2hl (128 orders) | 85.10 |