AC/DC: Alternating Compressed/DeCompressed Training of Deep Neural Networks

Alexandra Peste, Eugenia Iofinova, Adrian Vladu, Dan Alistarh


Abstract

The increasing computational requirements of deep neural networks (DNNs) have led to significant interest in obtaining DNN models that are sparse, yet accurate. Recent work has investigated the even harder case of sparse training, where the DNN weights are, for as much as possible, already sparse to reduce computational costs during training. Existing sparse training methods are often empirical and can have lower accuracy relative to the dense baseline. In this paper, we present a general approach called Alternating Compressed/DeCompressed (AC/DC) training of DNNs, demonstrate convergence for a variant of the algorithm, and show that AC/DC outperforms existing sparse training methods in accuracy at similar computational budgets; at high sparsity levels, AC/DC even outperforms existing methods that rely on accurate pre-trained dense models. An important property of AC/DC is that it allows co-training of dense and sparse models, yielding accurate sparse-dense model pairs at the end of the training process. This is useful in practice, where compressed variants may be desirable for deployment in resource-constrained settings without re-doing the entire training flow, and also provides us with insights into the accuracy gap between dense and compressed models. The code is available at: https://github.com/IST-DASLab/ACDC .
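The alternating schedule described in the abstract can be illustrated with a toy NumPy sketch: even phases train densely, odd phases re-prune by global weight magnitude and keep updates on the sparse support. The function names, phase lengths, and the quadratic toy objective below are all illustrative assumptions, not the paper's implementation (see the linked repository for that); in particular, the actual method adds a dense warm-up, a longer final sparse fine-tuning phase, and returns a sparse/dense model pair.

```python
import numpy as np

def topk_mask(w, sparsity):
    """Boolean mask keeping the largest-magnitude (1 - sparsity) fraction of weights."""
    k = int(round((1.0 - sparsity) * w.size))
    if k == 0:
        return np.zeros(w.shape, dtype=bool)
    thresh = np.sort(np.abs(w).ravel())[-k]  # k-th largest magnitude
    return np.abs(w) >= thresh

def acdc_train(w, grad_fn, steps, phase_len, sparsity, lr):
    """Toy sketch of alternating dense / compressed (magnitude-masked) training."""
    mask = np.ones(w.shape, dtype=bool)
    for step in range(steps):
        phase = step // phase_len
        compressed = phase % 2 == 1            # even phases dense, odd phases sparse
        if compressed and step % phase_len == 0:
            mask = topk_mask(w, sparsity)      # re-prune at each compression phase
            w = w * mask                       # project onto the sparse support
        w = w - lr * grad_fn(w)                # plain gradient step
        if compressed:
            w = w * mask                       # keep updates on the support only
    return w

# Toy usage: recover a vector whose large-magnitude entries survive pruning.
target = np.array([5.0, -4.0, 0.1, 0.05, 3.0, -0.2, 0.01, 2.0, 0.0, -1.0])
grad_fn = lambda w: w - target                 # gradient of 0.5 * ||w - target||^2
w = acdc_train(np.zeros_like(target), grad_fn,
               steps=100, phase_len=10, sparsity=0.5, lr=0.2)
```

Viewed this way, each compressed phase is gradient descent projected onto a top-k support, which is the lens under which the paper's convergence argument for a variant of the algorithm is naturally read.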

Code Repositories

IST-DASLab/ACDC (official, PyTorch)
IST-DASLab/sparseprop (PyTorch)

Benchmarks

Benchmark                                  Method    Metric
network-pruning-on-cifar-100               AC/DC     Accuracy: 78.2
network-pruning-on-cifar-100               Dense     Accuracy: 79
network-pruning-on-imagenet                ResNet50  Accuracy: 73.14
network-pruning-on-imagenet-resnet-50-90   AC/DC     Top-1 Accuracy: 75.64
