HyperAI
DyTox: Transformers for Continual Learning with DYnamic TOken eXpansion

Arthur Douillard Alexandre Ramé Guillaume Couairon Matthieu Cord

Abstract

Deep network architectures struggle to continually learn new tasks without forgetting previous ones. A recent trend indicates that dynamic architectures based on an expansion of the parameters can reduce catastrophic forgetting efficiently in continual learning. However, existing approaches often require a task identifier at test time, need complex tuning to balance the growing number of parameters, and barely share any information across tasks. As a result, they struggle to scale to a large number of tasks without significant overhead. In this paper, we propose a transformer architecture based on a dedicated encoder/decoder framework. Critically, the encoder and decoder are shared among all tasks. Through a dynamic expansion of special tokens, we specialize each forward pass of our decoder network on a task distribution. Our strategy scales to a large number of tasks while having negligible memory and time overheads due to strict control of the parameter expansion. Moreover, this efficient strategy does not need any hyperparameter tuning to control the network's expansion. Our model reaches excellent results on CIFAR100 and state-of-the-art performance on the large-scale ImageNet100 and ImageNet1000 while having fewer parameters than concurrent dynamic frameworks.
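The core idea in the abstract — a shared decoder specialized per task by expanding only a small set of learnable task tokens — can be sketched as a minimal PyTorch module. This is an illustrative reconstruction, not the official DyTox implementation; the class and method names (`TaskAttentionDecoder`, `add_task`) are assumptions for illustration.

```python
import torch
import torch.nn as nn

class TaskAttentionDecoder(nn.Module):
    """Sketch of DyTox-style dynamic token expansion (hypothetical API).

    A single decoder block is shared across all tasks; each task adds only
    one learnable task token and a small classifier head, so the parameter
    growth per task is negligible compared to the shared backbone.
    """
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.dim = dim
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.task_tokens = nn.ParameterList()  # grows by one token per task
        self.classifiers = nn.ModuleList()     # one head per task

    def add_task(self, num_classes):
        # Expansion step: one (1, 1, dim) token and one linear head per task.
        self.task_tokens.append(nn.Parameter(torch.zeros(1, 1, self.dim)))
        self.classifiers.append(nn.Linear(self.dim, num_classes))

    def forward(self, patch_tokens):
        # patch_tokens: (batch, num_patches, dim) from the shared encoder.
        logits = []
        for token, head in zip(self.task_tokens, self.classifiers):
            query = token.expand(patch_tokens.size(0), -1, -1)
            # Task-attention: the task token queries the shared patch tokens,
            # specializing this forward pass on that task's distribution.
            out, _ = self.attn(query, patch_tokens, patch_tokens)
            logits.append(head(self.norm(out[:, 0])))
        return torch.cat(logits, dim=1)  # concatenated per-task logits
```

For example, after two `add_task(10)` calls, a batch of encoded patches of shape `(2, 16, 64)` produces logits of shape `(2, 20)`; no task identifier is needed at test time because every task token is applied to the same shared features.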

Code Repositories

arthurdouillard/dytox (official, PyTorch)

Benchmarks

Benchmark: incremental-learning-on-imagenet-10-steps — Method: DyTox
# M Params: 11.36
Average Incremental Accuracy: 71.29
Average Incremental Accuracy Top-5: 88.59
Final Accuracy: 63.34
Final Accuracy Top-5: 84.49

Benchmark: incremental-learning-on-imagenet100-10-steps — Method: DyTox
# M Params: 11.01
Average Incremental Accuracy: 77.15
Average Incremental Accuracy Top-5: 92.04
Final Accuracy: 69.10
Final Accuracy Top-5: 87.98
