Semi-supervised Learning with Ladder Networks
Antti Rasmus; Harri Valpola; Mikko Honkala; Mathias Berglund; Tapani Raiko

Abstract
We combine supervised learning with unsupervised learning in deep neural networks. The proposed model is trained to simultaneously minimize the sum of supervised and unsupervised cost functions by backpropagation, avoiding the need for layer-wise pre-training. Our work builds on the Ladder network proposed by Valpola (2015), which we extend by combining the model with supervision. We show that the resulting model reaches state-of-the-art performance in semi-supervised MNIST and CIFAR-10 classification, in addition to permutation-invariant MNIST classification with all labels.
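The sketch below illustrates the idea of minimizing a supervised and an unsupervised cost together in one backpropagation pass. It is a minimal illustration, not the authors' implementation: the layer sizes, noise level, cost weight `lam`, and the use of a single denoising cost (roughly the Γ-variant of the Ladder network) are simplifying assumptions.

```python
# Minimal sketch (not the authors' code): jointly minimize a supervised
# cross-entropy cost on labeled data and an unsupervised denoising cost
# on all data. Hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyLadder(nn.Module):
    def __init__(self, d_in=784, d_hid=256, n_classes=10, noise_std=0.3):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(d_in, d_hid), nn.ReLU())
        self.classifier = nn.Linear(d_hid, n_classes)
        self.decoder = nn.Linear(d_hid, d_hid)  # denoises the hidden activation

    def forward(self, x):
        h_clean = self.encoder(x)                                  # clean encoder pass
        h_noisy = self.encoder(x + torch.randn_like(x) * self.noise_std)
        logits = self.classifier(h_noisy)                          # supervised head on the noisy path
        h_denoised = self.decoder(h_noisy)                         # decoder tries to recover the clean activation
        return logits, h_clean, h_denoised

def total_cost(model, x_labeled, y, x_unlabeled, lam=1.0):
    # Supervised cost: labeled examples only.
    logits, _, _ = model(x_labeled)
    sup = F.cross_entropy(logits, y)
    # Unsupervised denoising cost: labeled and unlabeled examples together.
    x_all = torch.cat([x_labeled, x_unlabeled], dim=0)
    _, h_clean_all, h_denoised_all = model(x_all)
    unsup = F.mse_loss(h_denoised_all, h_clean_all.detach())  # detaching the target is a simplification
    # One backward pass through this sum trains both objectives at once.
    return sup + lam * unsup
```

At evaluation time the clean encoder path would be used for prediction; the noisy path and the decoder serve only the training costs.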
Code Repositories
| Repository | Framework | Notes |
|---|---|---|
| brandonrobertz/AcademicUrlTitles | | Mentioned in GitHub |
| CuriousAI/ladder | | Mentioned in GitHub |
| jubueche/Convolutional-LadderNet | TensorFlow | Mentioned in GitHub |
| Pongpisit-Thanasutives/Multi-task-Physics-informed-neural-networks | PyTorch | Mentioned in GitHub |
| arasmus/ladder | | Official; mentioned in GitHub |
| udibr/LRE | | Mentioned in GitHub |
| AbhinavS99/Ladder-Networks-for-Sign-Languages | TensorFlow | Mentioned in GitHub |
| DevD1092/face_emotion_recog | TensorFlow | Mentioned in GitHub |
| NaturalHistoryMuseum/semantic-segmentation | PyTorch | Mentioned in GitHub |
| divamgupta/ladder_network_keras | TensorFlow | Mentioned in GitHub |
Benchmarks
| Benchmark | Methodology | Metric |
|---|---|---|
| Semi-supervised image classification on CIFAR-10 | Γ-model | Percentage error: 20.4 |