Rapid Adaptation with Conditionally Shifted Neurons
Tsendsuren Munkhdalai; Xingdi Yuan; Soroush Mehri; Adam Trischler

Abstract
We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons. We apply this mechanism in the framework of metalearning, where the aim is to replicate some of the flexibility of human learning in machines. Conditionally shifted neurons modify their activation values with task-specific shifts retrieved from a memory module, which is populated rapidly based on limited task experience. On metalearning benchmarks from the vision and language domains, models augmented with conditionally shifted neurons achieve state-of-the-art results.
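To make the mechanism concrete, here is a minimal, simplified sketch of the idea in NumPy: a layer retrieves a task-specific shift by soft attention over a small memory of key/shift pairs and adds it to the pre-activation before the nonlinearity. The memory contents, dimensions, and similarity function here are illustrative assumptions, not the paper's exact formulation (which also specifies how the shifts are computed from task experience).

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical memory populated from a handful of support examples:
# each entry pairs a key (a representation of a support input) with a
# task-specific shift vector for this layer's neurons.
rng = np.random.default_rng(0)
num_entries, dim = 5, 8
memory_keys = rng.normal(size=(num_entries, dim))
memory_shifts = rng.normal(size=(num_entries, dim))

def conditionally_shifted_relu(pre_activation):
    """Retrieve a shift by soft attention over the memory keys,
    then add it to the pre-activation before the nonlinearity."""
    scores = memory_keys @ pre_activation   # similarity to each stored key
    weights = softmax(scores)               # attention weights over entries
    shift = weights @ memory_shifts         # convex combination of shifts
    return np.maximum(0.0, pre_activation + shift)

x = rng.normal(size=dim)
out = conditionally_shifted_relu(x)
print(out.shape)  # (8,)
```

At test time only the memory changes between tasks; the layer's trained weights stay fixed, which is what makes the adaptation rapid and data-efficient.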
Benchmarks
| Benchmark | Model | Accuracy |
|---|---|---|
| few-shot-image-classification-on-mini-2 | adaResNet (DF) | 56.88% |
| few-shot-image-classification-on-mini-3 | adaResNet (DF) | 71.94% |
| few-shot-image-classification-on-omniglot-1-1 | adaCNN (DF) | 96.12% |
| few-shot-image-classification-on-omniglot-1-2 | adaCNN (DF) | 98.42% |
| few-shot-image-classification-on-omniglot-5-1 | adaCNN (DF) | 98.43% |
| few-shot-image-classification-on-omniglot-5-2 | adaCNN (DF) | 99.37% |