Efficiently Modeling Long Sequences with Structured State Spaces

Albert Gu, Karan Goel, Christopher Ré

Abstract

A central goal of sequence modeling is designing a single principled model that can address sequence data across a range of modalities and tasks, particularly on long-range dependencies. Although conventional models including RNNs, CNNs, and Transformers have specialized variants for capturing long dependencies, they still struggle to scale to very long sequences of $10000$ or more steps. A promising recent approach proposed modeling sequences by simulating the fundamental state space model (SSM) $x'(t) = Ax(t) + Bu(t),\, y(t) = Cx(t) + Du(t)$, and showed that for appropriate choices of the state matrix $A$, this system could handle long-range dependencies mathematically and empirically. However, this method has prohibitive computation and memory requirements, rendering it infeasible as a general sequence modeling solution. We propose the Structured State Space sequence model (S4) based on a new parameterization for the SSM, and show that it can be computed much more efficiently than prior approaches while preserving their theoretical strengths. Our technique involves conditioning $A$ with a low-rank correction, allowing it to be diagonalized stably and reducing the SSM to the well-studied computation of a Cauchy kernel. S4 achieves strong empirical results across a diverse range of established benchmarks, including (i) 91% accuracy on sequential CIFAR-10 with no data augmentation or auxiliary losses, on par with a larger 2-D ResNet, (ii) substantially closing the gap to Transformers on image and language modeling tasks while performing generation $60\times$ faster, and (iii) SoTA on every task from the Long Range Arena benchmark, including solving the challenging Path-X task of length 16k that all prior work fails on, while being as efficient as all competitors.
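For concreteness, below is a minimal NumPy sketch of the two equivalent views of a discretized SSM that the abstract alludes to: computing the whole output sequence as a convolution with an unrolled kernel, and stepping the recurrence for fast autoregressive generation. This is not the S4 algorithm itself (whose efficiency comes from the low-rank correction and Cauchy-kernel reduction described above); the bilinear discretization, matrix sizes, and toy data are illustrative assumptions, and the $Du(t)$ skip term is omitted.

```python
import numpy as np

def discretize(A, B, step):
    """Bilinear (Tustin) discretization of x'(t) = A x(t) + B u(t)."""
    N = A.shape[0]
    I = np.eye(N)
    inv = np.linalg.inv(I - (step / 2.0) * A)
    Ab = inv @ (I + (step / 2.0) * A)
    Bb = step * inv @ B
    return Ab, Bb

def ssm_kernel(Ab, Bb, C, L):
    """Naive unroll of the SSM convolution kernel
    K = (C Bb, C Ab Bb, C Ab^2 Bb, ...), costing O(N^2 L) work."""
    K, x = [], Bb
    for _ in range(L):
        K.append((C @ x).item())
        x = Ab @ x
    return np.array(K)

def causal_conv(u, K):
    """FFT-based causal (linear) convolution y = K * u."""
    L = len(u)
    n = 2 * L  # zero-pad to avoid circular wrap-around
    return np.fft.irfft(np.fft.rfft(u, n) * np.fft.rfft(K, n))[:L]

# Toy usage: random stable state matrix, scalar input/output channel.
rng = np.random.default_rng(0)
N, L = 16, 256
A = rng.normal(size=(N, N)) / N - np.eye(N)  # crude stabilization
B = rng.normal(size=(N, 1))
C = rng.normal(size=(1, N))
Ab, Bb = discretize(A, B, step=1.0 / L)

u = rng.normal(size=L)
y_conv = causal_conv(u, ssm_kernel(Ab, Bb, C, L))

# Equivalent recurrent view, the mode used for autoregressive generation:
x, y_rec = np.zeros((N, 1)), []
for u_k in u:
    x = Ab @ x + Bb * u_k
    y_rec.append((C @ x).item())

assert np.allclose(y_conv, np.array(y_rec), atol=1e-6)
```

The naive kernel construction above is exactly the prohibitive step the abstract refers to; S4's structured parameterization of $A$ (diagonal plus low rank) is what allows this kernel to be computed in near-linear time instead.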

Code Repositories

state-spaces/s4 (official; PyTorch)
forgi86/lru-reduction (JAX)
ag1988/dss (PyTorch)
maxtimer97/ssm-inspired-lif (PyTorch)
leonty1/essm (PyTorch)
elgazzarr/fmri-s4 (PyTorch)
nicolaszucchet/minimal-lru (JAX)

Benchmarks

Benchmark | Methodology | Metrics
language-modelling-on-wikitext-103 | S4 | Number of params: 249M; Test perplexity: 21.28
sequential-image-classification-on-sequential | S4 | Permuted Accuracy: 98.70%; Unpermuted Accuracy: 99.63%
sequential-image-classification-on-sequential-1 | S4 | Unpermuted Accuracy: 91.80%
speech-recognition-on-speech-commands-2 | S4 | Accuracy (%): 98.32
