Multiplicative LSTM for sequence modelling

Ben Krause; Liang Lu; Iain Murray; Steve Renals

Abstract

We introduce multiplicative LSTM (mLSTM), a recurrent neural network architecture for sequence modelling that combines the long short-term memory (LSTM) and multiplicative recurrent neural network architectures. mLSTM is characterised by its ability to have different recurrent transition functions for each possible input, which we argue makes it more expressive for autoregressive density estimation. We demonstrate empirically that mLSTM outperforms standard LSTM and its deep variants for a range of character level language modelling tasks. In this version of the paper, we regularise mLSTM to achieve 1.27 bits/char on text8 and 1.24 bits/char on Hutter Prize. We also apply a purely byte-level mLSTM on the WikiText-2 dataset to achieve a character level entropy of 1.26 bits/char, corresponding to a word level perplexity of 88.8, which is comparable to word level LSTMs regularised in similar ways on the same task.
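The core idea in the abstract — a different recurrent transition for each input — comes from inserting a multiplicative intermediate state between the previous hidden state and the LSTM gates. A minimal NumPy sketch of one mLSTM step is given below; the parameter names and shapes are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlstm_step(x, h_prev, c_prev, params):
    """One mLSTM forward step (a sketch of the paper's equations;
    parameter names here are hypothetical)."""
    Wmx, Wmh, Whx, Whm, Wix, Wim, Wfx, Wfm, Wox, Wom = params
    # Multiplicative intermediate state: the elementwise product makes the
    # effective recurrent transition depend on the current input x.
    m = (Wmx @ x) * (Wmh @ h_prev)
    # Standard LSTM gating, with m taking the place of h_{t-1}
    i = sigmoid(Wix @ x + Wim @ m)   # input gate
    f = sigmoid(Wfx @ x + Wfm @ m)   # forget gate
    o = sigmoid(Wox @ x + Wom @ m)   # output gate
    c = f * c_prev + i * np.tanh(Whx @ x + Whm @ m)  # cell update
    h = o * np.tanh(c)               # new hidden state
    return h, c
```

Because `m` is an input-conditioned rescaling of the hidden state, each character in the vocabulary effectively selects its own recurrent weight matrix, which is the expressivity argument the abstract makes for autoregressive density estimation.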

Code Repositories

astakara48/python_project (TensorFlow) — Mentioned in GitHub

Benchmarks

Benchmark                           | Methodology               | Bits per Character (BPC) | Number of params
language-modelling-on-enwiki8      | Large mLSTM               | 1.24                     | 46M
language-modelling-on-hutter-prize | Large mLSTM +emb +WN +VD  | 1.24                     | 46M
language-modelling-on-text8        | Large mLSTM +emb +WN +VD  | 1.27                     | 45M
language-modelling-on-text8        | Unregularised mLSTM       | 1.40                     | 45M
