Shaojie Bai, J. Zico Kolter, Vladlen Koltun

Abstract
We present a new approach to modeling sequential data: the deep equilibrium model (DEQ). Motivated by an observation that the hidden layers of many existing deep sequence models converge towards some fixed point, we propose the DEQ approach that directly finds these equilibrium points via root-finding. Such a method is equivalent to running an infinite-depth (weight-tied) feedforward network, but has the notable advantage that we can analytically backpropagate through the equilibrium point using implicit differentiation. Using this approach, training and prediction in these networks require only constant memory, regardless of the effective "depth" of the network. We demonstrate how DEQs can be applied to two state-of-the-art deep sequence models: self-attention transformers and trellis networks. On large-scale language modeling tasks, such as the WikiText-103 benchmark, we show that DEQs 1) often improve performance over these state-of-the-art models (for similar parameter counts); 2) have similar computational requirements to existing models; and 3) vastly reduce memory consumption (often the bottleneck for training large sequence models), demonstrating a memory reduction of up to 88% in our experiments. The code is available at https://github.com/locuslab/deq.
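To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of an equilibrium layer: the forward pass finds z* = f(z*, x) by plain fixed-point iteration (the paper itself uses Broyden's method for root-finding), and the backward pass applies implicit differentiation by solving a second fixed-point problem with a gradient hook, so no intermediate iterations are stored. The class and helper names (`DEQFixedPoint`, `_solve`) and the toy cell in the usage example are illustrative assumptions, not the authors' implementation; see the repository linked above for the actual code.

```python
# Minimal DEQ sketch (hypothetical, not the authors' released code).
# Forward: solve z* = f(z*, x) with naive fixed-point iteration.
# Backward: solve g = J_f(z*)^T g + grad_out, i.e. g = (I - J^T)^{-1} grad_out,
# via the same iteration, so memory stays constant in the effective "depth".
import torch
import torch.nn as nn


class DEQFixedPoint(nn.Module):
    def __init__(self, f, max_iter=50, tol=1e-4):
        super().__init__()
        self.f = f            # weight-tied cell f(z, x) -> z
        self.max_iter = max_iter
        self.tol = tol

    def _solve(self, g, z0):
        # Simple fixed-point iteration z <- g(z); the paper uses Broyden's method.
        z = z0
        for _ in range(self.max_iter):
            z_next = g(z)
            if (z_next - z).norm() / (z.norm() + 1e-8) < self.tol:
                return z_next
            z = z_next
        return z

    def forward(self, x):
        # Find the equilibrium without tracking the unrolled iterations.
        # (Assumes the hidden state has the same shape as the input.)
        with torch.no_grad():
            z_star = self._solve(lambda z: self.f(z, x), torch.zeros_like(x))

        # Re-attach z* to the autograd graph with a single application of f.
        z_star = self.f(z_star, x)

        # Implicit differentiation: when a gradient reaches z*, replace it with
        # the solution of y = J_f(z*)^T y + grad, computed by vector-Jacobian products.
        z0 = z_star.clone().detach().requires_grad_()
        f0 = self.f(z0, x)

        def backward_hook(grad):
            return self._solve(
                lambda y: torch.autograd.grad(f0, z0, y, retain_graph=True)[0] + grad,
                grad,
            )

        z_star.register_hook(backward_hook)
        return z_star


if __name__ == "__main__":
    # Toy usage with a hypothetical weight-tied cell; any contractive f works.
    torch.manual_seed(0)
    d = 16
    lin = nn.Linear(d, d)
    cell = lambda z, x: torch.tanh(lin(z) + x)
    deq = DEQFixedPoint(cell)

    x = torch.randn(8, d, requires_grad=True)
    z_star = deq(x)
    z_star.sum().backward()   # gradients for lin and x via implicit differentiation
    print(z_star.shape, lin.weight.grad.norm())
```

In this sketch the only memory cost at training time is the single re-application of `f` at the equilibrium, which is the constant-memory property highlighted in the abstract.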
Benchmarks
| Benchmark | Model | Params | Test perplexity |
|---|---|---|---|
| Language Modelling on Penn Treebank (word level) | DEQ-TrellisNet | 24M | 57.1 |
| Language Modelling on WikiText-103 | DEQ-TrellisNet | 180M | 29.0 |
| Language Modelling on WikiText-103 | DEQ-Transformer (medium, adaptive embed) | 110M | 23.2 |
| Language Modelling on WikiText-103 | DEQ-Transformer (small) | 138M | 32.4 |