Scaling Vision with Sparse Mixture of Experts

Carlos Riquelme, Joan Puigcerver, Basil Mustafa, Maxim Neumann, Rodolphe Jenatton, André Susano Pinto, Daniel Keysers, Neil Houlsby


Abstract

Sparsely-gated Mixture of Experts networks (MoEs) have demonstrated excellent scalability in Natural Language Processing. In Computer Vision, however, almost all performant networks are "dense", that is, every input is processed by every parameter. We present a Vision MoE (V-MoE), a sparse version of the Vision Transformer that is scalable and competitive with the largest dense networks. When applied to image recognition, V-MoE matches the performance of state-of-the-art networks while requiring as little as half the compute at inference time. Further, we propose an extension to the routing algorithm that can prioritize subsets of each input across the entire batch, leading to adaptive per-image compute. This allows V-MoE to smoothly trade off performance and compute at test time. Finally, we demonstrate the potential of V-MoE to scale vision models, and train a 15B-parameter model that attains 90.35% on ImageNet.
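For readers unfamiliar with sparse expert layers, the sketch below illustrates the core routing idea in JAX/Flax: a lightweight router scores every token, each token is sent to its top-k experts, and the selected expert outputs are mixed using the renormalized gate weights. This is an illustrative sketch only, not the google-research/vmoe implementation; the module names (SparseMoE, Expert) and hyperparameters are placeholders, and the dense "run every expert on every token" formulation omits the capacity-constrained dispatch and Batch Prioritized Routing described in the paper.

```python
# Minimal sketch of top-k sparse expert routing (assumed names, not the official V-MoE code).
import jax
import jax.numpy as jnp
import flax.linen as nn


class Expert(nn.Module):
    """Per-expert MLP with the same shape as a ViT feed-forward block."""
    hidden_dim: int
    out_dim: int

    @nn.compact
    def __call__(self, x):
        x = nn.Dense(self.hidden_dim)(x)
        x = nn.gelu(x)
        return nn.Dense(self.out_dim)(x)


class SparseMoE(nn.Module):
    """Routes each token to its top-k experts and mixes their outputs.

    For readability this sketch evaluates every expert on every token;
    real implementations dispatch tokens to experts with capacity buffers
    so unselected experts do no work.
    """
    num_experts: int
    k: int
    hidden_dim: int
    out_dim: int

    @nn.compact
    def __call__(self, tokens):                      # tokens: [num_tokens, dim]
        # Router: one linear layer followed by a softmax over experts.
        logits = nn.Dense(self.num_experts, name="router")(tokens)
        probs = jax.nn.softmax(logits, axis=-1)      # [num_tokens, num_experts]

        # Keep only the top-k gate values per token and renormalize them.
        top_probs, top_idx = jax.lax.top_k(probs, self.k)
        top_probs = top_probs / jnp.sum(top_probs, axis=-1, keepdims=True)

        # Evaluate all experts (sketch only), then combine the outputs of the
        # selected experts with the gate weights.
        expert_outs = jnp.stack(
            [Expert(self.hidden_dim, self.out_dim)(tokens)
             for _ in range(self.num_experts)],
            axis=1)                                  # [tokens, experts, dim]
        one_hot = jax.nn.one_hot(top_idx, self.num_experts)   # [tokens, k, experts]
        combine = jnp.einsum("tk,tke->te", top_probs, one_hot)  # [tokens, experts]
        return jnp.einsum("te,ted->td", combine, expert_outs)   # [tokens, dim]


# Example usage: route 16 tokens of width 32 to 2 of 8 experts.
layer = SparseMoE(num_experts=8, k=2, hidden_dim=128, out_dim=32)
tokens = jnp.ones((16, 32))
params = layer.init(jax.random.PRNGKey(0), tokens)
out = layer.apply(params, tokens)                    # shape (16, 32)
```

In the paper, layers like this replace the MLP blocks of a Vision Transformer (either every second block, "Every-2", or the last few blocks, "Last-5"), which is why total parameter counts grow into the billions while per-token compute stays close to the dense model.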

Code Repositories

google-research/vmoe (official; JAX)

Benchmarks

Benchmark | Method | Metrics
few-shot-image-classification-on-imagenet-1-1 | ViT-H/14 | Top-1 Accuracy: 62.34
few-shot-image-classification-on-imagenet-1-1 | ViT-MoE-15B (Every-2) | Top-1 Accuracy: 68.66
few-shot-image-classification-on-imagenet-1-1 | V-MoE-L/16 (Every-2) | Top-1 Accuracy: 62.41
few-shot-image-classification-on-imagenet-1-1 | V-MoE-H/14 (Last-5) | Top-1 Accuracy: 62.95
few-shot-image-classification-on-imagenet-1-1 | V-MoE-H/14 (Every-2) | Top-1 Accuracy: 63.38
few-shot-image-classification-on-imagenet-10 | ViT-MoE-15B (Every-2) | Top-1 Accuracy: 84.29
few-shot-image-classification-on-imagenet-10 | V-MoE-H/14 (Last-5) | Top-1 Accuracy: 80.1
few-shot-image-classification-on-imagenet-10 | V-MoE-H/14 (Every-2) | Top-1 Accuracy: 80.33
few-shot-image-classification-on-imagenet-10 | ViT-H/14 | Top-1 Accuracy: 79.01
few-shot-image-classification-on-imagenet-5 | V-MoE-H/14 (Every-2) | Top-1 Accuracy: 78.21
few-shot-image-classification-on-imagenet-5 | ViT-MoE-15B (Every-2) | Top-1 Accuracy: 82.78
few-shot-image-classification-on-imagenet-5 | V-MoE-L/16 (Every-2) | Top-1 Accuracy: 77.1
few-shot-image-classification-on-imagenet-5 | V-MoE-H/14 (Last-5) | Top-1 Accuracy: 78.08
few-shot-image-classification-on-imagenet-5 | ViT-H/14 | Top-1 Accuracy: 76.95
image-classification-on-imagenet | V-MoE-H/14 (Every-2) | Params: 7200M; Top-1 Accuracy: 88.36%
image-classification-on-imagenet | ViT-H/14 | Params: 656M; Top-1 Accuracy: 88.08%
image-classification-on-imagenet | V-MoE-L/16 (Every-2) | Params: 3400M; Top-1 Accuracy: 87.41%
image-classification-on-jft-300m | ViT-H/14 | Prec@1: 56.68
image-classification-on-jft-300m | V-MoE-H/14 (Every-2) | Prec@1: 60.62
image-classification-on-jft-300m | V-MoE-L/16 (Every-2) | Prec@1: 57.65
image-classification-on-jft-300m | V-MoE-H/14 (Last-5) | Prec@1: 60.12
