HyperAI

PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning

Arun Mallya, Svetlana Lazebnik

Abstract

This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be employed to learn new tasks. By performing iterative pruning and network re-training, we are able to sequentially "pack" multiple tasks into a single network while ensuring minimal drop in performance and minimal storage overhead. Unlike prior work that uses proxy losses to maintain accuracy on older tasks, we always optimize for the task at hand. We perform extensive experiments on a variety of network architectures and large-scale datasets, and observe much better robustness against catastrophic forgetting than prior work. In particular, we are able to add three fine-grained classification tasks to a single ImageNet-trained VGG-16 network and achieve accuracies close to those of separately trained networks for each task. Code available at https://github.com/arunmallya/packnet

Code Repositories

- arunmallya/packnet (official, PyTorch)
- Lucasc-99/PackNet-Continual-Learning (PyTorch)
- Lucasc-99/packnet_cl (PyTorch)
- kamsyn95/CL_DNN (PyTorch)


