Hierarchical Transformers Are More Efficient Language Models

Piotr Nawrot, Szymon Tworkowski, Michał Tyrolski, Łukasz Kaiser, Yuhuai Wu, Christian Szegedy, Henryk Michalewski

Abstract

Transformer models yield impressive results on many NLP and sequence modeling tasks. Remarkably, Transformers can handle long sequences, which allows them to produce long coherent outputs: full paragraphs produced by GPT-3 or well-structured images produced by DALL-E. These large language models are impressive but also very inefficient and costly, which limits their applications and accessibility. We postulate that having an explicit hierarchical architecture is the key to Transformers that efficiently handle long sequences. To verify this claim, we first study different ways to downsample and upsample activations in Transformers so as to make them hierarchical. We use the best-performing upsampling and downsampling layers to create Hourglass, a hierarchical Transformer language model. Hourglass improves upon the Transformer baseline given the same amount of computation and can yield the same results as Transformers more efficiently. In particular, Hourglass sets a new state of the art for Transformer models on the ImageNet32 generation task and improves language modeling efficiency on the widely studied enwik8 benchmark.
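
The core idea lends itself to a short sketch. The toy PyTorch block below is our illustration, not the authors' released code: names such as `HourglassBlock`, `downsample_avg`, and `upsample_repeat` are invented for this example, and average-pool shortening with repeat upsampling is only one simple choice among the downsampling/upsampling layers the paper compares. It shows where the savings come from: the middle layers attend over a sequence k times shorter, so their self-attention cost drops roughly k²-fold.

```python
import torch
import torch.nn as nn

def downsample_avg(x: torch.Tensor, k: int) -> torch.Tensor:
    """Shorten (batch, length, d_model) activations by factor k via average pooling."""
    b, l, d = x.shape
    assert l % k == 0, "sequence length must be divisible by the shortening factor"
    return x.reshape(b, l // k, k, d).mean(dim=2)

def upsample_repeat(x: torch.Tensor, k: int) -> torch.Tensor:
    """Expand (batch, length/k, d_model) back to full length by repeating each vector."""
    return x.repeat_interleave(k, dim=1)

class HourglassBlock(nn.Module):
    """Full-resolution layer -> shorten -> middle layer -> upsample + residual -> full-resolution layer."""

    def __init__(self, d_model: int, n_heads: int, k: int):
        super().__init__()
        self.k = k
        make_layer = lambda: nn.TransformerEncoderLayer(
            d_model, n_heads, batch_first=True, norm_first=True)
        self.pre, self.middle, self.post = make_layer(), make_layer(), make_layer()

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, length, d_model)
        x = self.pre(x)                                   # operates on the full sequence
        short = self.middle(downsample_avg(x, self.k))    # attends over length/k positions
        x = x + upsample_repeat(short, self.k)            # residual re-injects token-level detail
        # NB: a causal language model additionally needs shifting/masking around the
        # pooling so that no position can see future tokens; omitted here for brevity.
        return self.post(x)

# Usage: 8 tokens shortened by k = 4, so the middle layer attends over only 2 positions.
block = HourglassBlock(d_model=64, n_heads=4, k=4)
print(block(torch.randn(1, 8, 64)).shape)  # torch.Size([1, 8, 64])
```

The residual connection around the shortened trunk matters: repeat upsampling alone produces k identical neighboring vectors, and adding back the pre-pooling activations restores the per-token detail the final layers need.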

Benchmarks

| Benchmark                          | Methodology | Metrics                          |
|------------------------------------|-------------|----------------------------------|
| Image Generation on ImageNet 32x32 | Hourglass   | Bits per dim (bpd): 3.74         |
| Image Generation on ImageNet 64x64 | Hourglass   | Bits per dim (bpd): 3.44         |
| Language Modelling on enwik8       | Hourglass   | Bits per character (BPC): 0.997  |
