N-Grammer: Augmenting Transformers with latent n-grams

Abstract

Transformer models have recently emerged as one of the foundational models in natural language processing, and as a byproduct, there is significant recent interest and investment in scaling these models. However, the training and inference costs of these large Transformer language models are prohibitive, necessitating more research into identifying more efficient variants. In this work, we propose a simple yet effective modification to the Transformer architecture, inspired by the statistical language modeling literature: we augment the model with n-grams constructed from a discrete latent representation of the text sequence. We evaluate our model, the N-Grammer, on language modeling on the C4 dataset as well as on text classification on the SuperGLUE dataset, and find that it outperforms several strong baselines such as the Transformer and the Primer. We open-source our model in JAX for reproducibility purposes.
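The mechanism the abstract describes (discrete latent ids per token, bigrams formed over those ids, and an embedding lookup fused back into the token representation) can be sketched compactly. What follows is a minimal, illustrative JAX sketch, not the paper's implementation (see the repository below); the codebook size, the modular hashing of bigram ids, and all dimensions are assumptions made for the example.

```python
# Minimal sketch of the N-Grammer idea, NOT the official implementation
# (see yiyixuxu/n-grammer-flax). Codebook size, hashing scheme, and all
# dimensions below are illustrative assumptions.
import jax
import jax.numpy as jnp

def latent_ids(x, codebook):
    """Quantize per-token embeddings to discrete latent ids by
    nearest-neighbour search against a learned codebook."""
    # x: [batch, seq, dim], codebook: [num_clusters, dim]
    dists = jnp.sum((x[:, :, None, :] - codebook) ** 2, axis=-1)
    return jnp.argmin(dists, axis=-1)  # [batch, seq]

def bigram_ids(ids, num_clusters, ngram_vocab_size):
    """Pair each latent id with its predecessor to form bigram ids,
    then hash them into a fixed-size n-gram vocabulary."""
    prev = jnp.pad(ids[:, :-1], ((0, 0), (1, 0)))  # shift right; pad position 0
    pair = ids * num_clusters + prev               # unique id per (prev, cur) pair
    return pair % ngram_vocab_size                 # simple modular hash (assumption)

def ngrammer_layer(x, codebook, ngram_table, num_clusters):
    """Augment token embeddings with latent bigram embeddings by
    layer-normalizing both and concatenating along the feature axis."""
    def layer_norm(t):  # parameter-free layer norm, for brevity
        mean = t.mean(axis=-1, keepdims=True)
        var = t.var(axis=-1, keepdims=True)
        return (t - mean) / jnp.sqrt(var + 1e-6)

    ids = latent_ids(x, codebook)
    bids = bigram_ids(ids, num_clusters, ngram_table.shape[0])
    ngram_emb = ngram_table[bids]  # [batch, seq, ngram_dim]
    return jnp.concatenate([layer_norm(x), layer_norm(ngram_emb)], axis=-1)

# Toy usage with random parameters.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
x = jax.random.normal(k1, (2, 8, 16))           # [batch, seq, dim]
codebook = jax.random.normal(k2, (64, 16))      # 64 latent clusters (assumption)
ngram_table = jax.random.normal(k3, (1024, 8))  # hashed bigram vocab (assumption)
out = ngrammer_layer(x, codebook, ngram_table, num_clusters=64)
print(out.shape)  # (2, 8, 24)
```

In the paper, the latent representation comes from product quantization over multiple heads with a codebook learned during training; this sketch collapses that into a single nearest-neighbour lookup to keep the shape of the computation visible.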

Code Repositories

yiyixuxu/n-grammer-flax (JAX), mentioned on GitHub

Benchmarks

Benchmark | Model | Metrics
Common-Sense Reasoning on ReCoRD | N-Grammer 343M | EM: 28.9, F1: 29.9
Coreference Resolution on Winograd Schema | N-Grammer 343M | Accuracy: 68.3
Language Modelling on C4 | N-Grammer 343M | Perplexity: 14.79
Language Modelling on C4 | N-Grammer 288M | Perplexity: 15.01
Natural Language Inference on CommitmentBank | N-Grammer 343M | Accuracy: 67.9, F1: 59.7
Natural Language Inference on RTE | N-Grammer 343M | Accuracy: 59.2
Question Answering on BoolQ | N-Grammer 343M | Accuracy: 65
Question Answering on COPA | N-Grammer 343M | Accuracy: 60.0
Question Answering on MultiRC | N-Grammer 343M | EM: 11.3, F1: 62
Word Sense Disambiguation on Words in Context | N-Grammer 343M | Accuracy: 56.1

