Longformer: The Long-Document Transformer

Iz Beltagy, Matthew E. Peters, Arman Cohan

Abstract

Transformer-based models are unable to process long sequences due to their self-attention operation, which scales quadratically with the sequence length. To address this limitation, we introduce the Longformer with an attention mechanism that scales linearly with sequence length, making it easy to process documents of thousands of tokens or longer. Longformer's attention mechanism is a drop-in replacement for the standard self-attention and combines a local windowed attention with a task-motivated global attention. Following prior work on long-sequence transformers, we evaluate Longformer on character-level language modeling and achieve state-of-the-art results on text8 and enwik8. In contrast to most prior work, we also pretrain Longformer and finetune it on a variety of downstream tasks. Our pretrained Longformer consistently outperforms RoBERTa on long-document tasks and sets new state-of-the-art results on WikiHop and TriviaQA. We finally introduce the Longformer-Encoder-Decoder (LED), a Longformer variant for supporting long-document generative sequence-to-sequence tasks, and demonstrate its effectiveness on the arXiv summarization dataset.
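The attention pattern the abstract describes (a sliding local window plus a few task-motivated global tokens) can be illustrated with a small boolean mask. This is only a sketch of the pattern, not the paper's implementation, which uses custom kernels for efficiency; the function name `longformer_attention_mask` and its parameters are illustrative, not from the official code:

```python
import torch

def longformer_attention_mask(seq_len, window, global_idx):
    """Boolean (seq_len, seq_len) mask: True where attention is allowed.

    Combines a sliding local window with full ("global") attention for a
    few task-specific positions, per the description in the abstract.
    """
    i = torch.arange(seq_len)
    # Local attention: each token attends to neighbors within +/- window//2.
    mask = (i[:, None] - i[None, :]).abs() <= window // 2
    # Global attention is symmetric: global tokens attend everywhere,
    # and every token attends to the global tokens.
    g = torch.tensor(global_idx)
    mask[g, :] = True
    mask[:, g] = True
    return mask

# Each row allows O(window + num_global) positions, so total work grows
# linearly with seq_len instead of quadratically.
mask = longformer_attention_mask(seq_len=4096, window=512, global_idx=[0])
print(mask.sum(dim=1)[:4])  # per-token counts of attended positions
```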

Code Repositories

- mim-solutions/bert_for_longer_texts (PyTorch)
- microsoft/dialoglm (PyTorch)
- lucashueda/long_sentence_transformer (PyTorch)
- mistralai/mistral-src (PyTorch)
- kit-mrt/red-motion (PyTorch)
- jaketae/pytorch-malware-detection (PyTorch)
- huggingface/transformers (PyTorch)
- allenai/longformer (official, PyTorch)
- schenliu/longformer-chinese (PyTorch)
- han-shi/SparseBERT (PyTorch)
- naver-ai/simseek (PyTorch)
- kit-mrt/road-barlow-twins (PyTorch)
- Phrase-in-Context/eval (PyTorch)
- facebookresearch/xformers (PyTorch)
- a-rios/ats-models (PyTorch)
- mim-solutions/roberta_for_longer_texts (PyTorch)
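The official allenai/longformer checkpoints are also exposed through the huggingface/transformers port listed above. As a hedged usage sketch (assuming the `allenai/longformer-base-4096` checkpoint and the `global_attention_mask` argument of the transformers Longformer classes), loading the model and marking one global token looks roughly like this:

```python
import torch
from transformers import LongformerModel, LongformerTokenizerFast

tokenizer = LongformerTokenizerFast.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tokenizer("A long document ...", return_tensors="pt")

# By default every token gets local (windowed) attention; set 1 for the
# tokens that should additionally attend globally, e.g. the first token.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

outputs = model(**inputs, global_attention_mask=global_attention_mask)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```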

Benchmarks

| Benchmark | Method | Metric | Value | Params |
|---|---|---|---|---|
| Language modelling on enwik8 | Longformer (12 layers, h=512) | Bits per character (BPC) | 1.00 | 41M |
| Language modelling on enwik8 | Longformer (30 layers, h=512) | Bits per character (BPC) | 0.99 | 102M |
| Language modelling on Hutter Prize | Longformer Small | Bits per character (BPC) | 1.00 | 41M |
| Language modelling on Hutter Prize | Longformer Large | Bits per character (BPC) | 0.99 | 102M |
| Question answering on WikiHop | Longformer-large | Test accuracy | 81.9 | – |
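BPC in the table is the model's average negative log-likelihood per character expressed in bits: with a cross-entropy loss measured in nats, BPC = loss / ln 2. A minimal sketch of the conversion, using made-up logits and targets purely for illustration:

```python
import math
import torch
import torch.nn.functional as F

# Hypothetical character-level predictions, just to show the metric.
logits = torch.randn(1, 8, 256)           # (batch, chars, vocab)
targets = torch.randint(0, 256, (1, 8))   # character ids

nll_nats = F.cross_entropy(logits.transpose(1, 2), targets)  # mean NLL in nats
bpc = nll_nats.item() / math.log(2)       # convert nats -> bits per character
print(f"BPC = {bpc:.2f}")
```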
