HyperTransformer: Model Generation for Supervised and Semi-Supervised Few-Shot Learning

Andrey Zhmoginov Mark Sandler Max Vladymyrov

Abstract

In this work we propose a HyperTransformer, a Transformer-based model for supervised and semi-supervised few-shot learning that generates weights of a convolutional neural network (CNN) directly from support samples. Since the dependence of a small generated CNN model on a specific task is encoded by a high-capacity Transformer model, we effectively decouple the complexity of the large task space from the complexity of individual tasks. Our method is particularly effective for small target CNN architectures where learning a fixed universal task-independent embedding is not optimal and better performance is attained when the information about the task can modulate all model parameters. For larger models we discover that generating the last layer alone allows us to produce competitive or better results than those obtained with state-of-the-art methods while being end-to-end differentiable.
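The core idea of the abstract, a high-capacity Transformer that reads labeled support samples and emits the weights of a small target CNN, can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the authors' architecture: the image encoder, token construction, and the choice to generate a single convolutional kernel by pooling the encoded support set are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperTransformerSketch(nn.Module):
    """Toy sketch: a Transformer maps a labeled support set to CNN weights.

    All dimensions and the pooling scheme are illustrative assumptions,
    not the paper's actual design.
    """
    def __init__(self, embed_dim=64, n_classes=5, out_ch=8, in_ch=1, k=3):
        super().__init__()
        self.feat = nn.Linear(28 * 28, embed_dim)        # toy image embedder
        self.label_emb = nn.Embedding(n_classes, embed_dim)
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # One output unit per generated CNN parameter.
        self.to_weights = nn.Linear(embed_dim, out_ch * in_ch * k * k)
        self.weight_shape = (out_ch, in_ch, k, k)

    def forward(self, support_x, support_y):
        # support_x: (n_support, 1, 28, 28); support_y: (n_support,)
        # Each support sample becomes one token: image embedding + label embedding.
        tokens = self.feat(support_x.flatten(1)) + self.label_emb(support_y)
        enc = self.encoder(tokens.unsqueeze(0))          # (1, n_support, d)
        # Pool over the support set, then decode into a conv kernel.
        w = self.to_weights(enc.mean(dim=1)).view(self.weight_shape)
        return w                                         # generated conv weights

model = HyperTransformerSketch()
x = torch.randn(10, 1, 28, 28)       # e.g. a 5-way, 2-shot support set
y = torch.arange(5).repeat(2)        # labels for the support samples
w = model(x, y)
# The generated kernel is applied to query images, so the whole pipeline
# (Transformer -> weights -> query prediction) stays end-to-end differentiable.
queries = torch.randn(2, 1, 28, 28)
out = F.conv2d(queries, w)
print(w.shape, out.shape)
```

Because the query-side CNN uses the generated tensor `w` directly, gradients from the query loss flow back through the Transformer, matching the end-to-end differentiability the abstract emphasizes; generating only the last layer for larger models is the same mechanism applied to fewer parameters.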

Benchmarks

Benchmark | Methodology | Metric
few-shot-image-classification-on-omniglot-1-1 | MAML++ | Accuracy: 97.7%
few-shot-image-classification-on-omniglot-5-1 | MAML++ | Accuracy: 99.3%
few-shot-image-classification-on-tiered-1 | HyperTransformer | Accuracy: 73.9%
few-shot-image-classification-on-tiered-1 | RFS | Accuracy: 73.2%
