Continual Learning of Context-dependent Processing in Neural Networks

Guanxiong Zeng; Yang Chen; Bo Cui; Shan Yu

Abstract

Deep neural networks (DNNs) are powerful tools for learning sophisticated but fixed mappings between inputs and outputs, which limits their use in more complex and dynamic situations where the mapping rules are not fixed but change according to context. To lift this limitation, we developed a novel approach combining a learning algorithm, called orthogonal weights modification (OWM), with a context-dependent processing (CDP) module. We demonstrated that, with OWM to overcome catastrophic forgetting and the CDP module to learn how to reuse a feature representation and a classifier across different contexts, a single network can acquire numerous context-dependent mapping rules in an online and continual manner, with as few as $\sim$10 samples needed to learn each. This should enable highly compact systems to gradually learn myriad regularities of the real world and eventually behave appropriately within it.
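The core of OWM is to constrain each weight update to lie (approximately) in the subspace orthogonal to the inputs of previously learned tasks, so that new learning does not disturb old mappings. Below is a minimal PyTorch sketch of that idea for a single linear layer, assuming the recursive projector update described in the paper; the class name `OWMLinear`, the hyperparameters `alpha` and `lr`, and the method `owm_update` are illustrative and do not mirror the official repository's API.

```python
import torch

class OWMLinear:
    """Sketch of a linear layer trained with orthogonal weights modification (OWM)."""

    def __init__(self, in_features, out_features, alpha=1e-3):
        self.weight = torch.zeros(out_features, in_features)
        # Projector onto the subspace orthogonal to inputs seen so far;
        # starts as the identity (nothing learned yet).
        self.P = torch.eye(in_features)
        self.alpha = alpha

    def forward(self, x):
        return x @ self.weight.t()

    def owm_update(self, x_mean, grad_w, lr=0.1):
        # x_mean: (in_features,) mean input of the current batch
        # grad_w: (out_features, in_features) ordinary gradient of the loss w.r.t. the weight
        x = x_mean.view(-1, 1)
        Px = self.P @ x
        # Recursive, least-squares-style update of the projector.
        self.P = self.P - (Px @ Px.t()) / (self.alpha + x.t() @ Px)
        # Project the gradient so the update stays orthogonal to the
        # input space of earlier tasks, protecting old input-output rules.
        self.weight = self.weight - lr * grad_w @ self.P
```

In a continual-learning loop, `owm_update` would replace the plain SGD step for each batch; the CDP module described in the abstract would additionally gate or rotate the layer's inputs per context, which is omitted from this sketch.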

Code Repositories

beijixiong3510/OWM (Official, PyTorch)

Benchmarks

Benchmark: continual-learning-on-asc-19-tasks
Methodology: OWM
Metrics: F1 - macro: 0.7931
