Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN

Shuai Li; Wanqing Li; Chris Cook; Ce Zhu; Yanbo Gao

Abstract

Recurrent neural networks (RNNs) have been widely used for processing sequential data. However, RNNs are commonly difficult to train due to the well-known gradient vanishing and exploding problems, which also make it hard for them to learn long-term patterns. Long short-term memory (LSTM) and gated recurrent unit (GRU) networks were developed to address these problems, but their use of the hyperbolic tangent and sigmoid activation functions results in gradient decay over layers. Consequently, constructing an efficiently trainable deep network is challenging. In addition, all the neurons in an RNN layer are entangled together and their behaviour is hard to interpret. To address these problems, a new type of RNN, referred to as the independently recurrent neural network (IndRNN), is proposed in this paper, where neurons in the same layer are independent of each other and are connected across layers. We have shown that an IndRNN can be easily regulated to prevent the gradient exploding and vanishing problems while allowing the network to learn long-term dependencies. Moreover, an IndRNN can work with non-saturated activation functions such as relu (rectified linear unit) and still be trained robustly. Multiple IndRNNs can be stacked to construct a network that is deeper than existing RNNs. Experimental results have shown that the proposed IndRNN is able to process very long sequences (over 5000 time steps), can be used to construct very deep networks (21 layers in the experiments) and can still be trained robustly. Better performance has been achieved on various tasks with IndRNNs than with the traditional RNN and LSTM. The code is available at https://github.com/Sunnydreamrain/IndRNN_Theano_Lasagne.
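
The recurrence the abstract describes is h_t = relu(W x_t + u * h_{t-1} + b), where the recurrent weight u is a per-neuron vector rather than a matrix: each neuron recurs only over its own previous state, and neurons interact only through the stacked layers. The following is a minimal PyTorch sketch of that idea, not the official implementation; the class name IndRNNCell, the clamp bound u_max, and the toy sizes are illustrative assumptions.

import torch
import torch.nn as nn

class IndRNNCell(nn.Module):
    """One IndRNN layer step: h_t = relu(W x_t + u * h_{t-1} + b).

    The recurrent weight u is a per-neuron vector, so the recurrence is an
    element-wise product instead of the full matrix multiply of a vanilla RNN.
    """
    def __init__(self, input_size, hidden_size, u_max=1.0):
        super().__init__()
        self.w_in = nn.Linear(input_size, hidden_size)      # computes W x_t + b
        self.u = nn.Parameter(torch.empty(hidden_size).uniform_(0.0, u_max))
        self.u_max = u_max                                  # assumed bound, for illustration

    def forward(self, x, h):
        # Clamping |u| is a simple stand-in for the paper's regulation of the
        # recurrent weights, which prevents the gradient from exploding over time.
        u = self.u.clamp(-self.u_max, self.u_max)
        return torch.relu(self.w_in(x) + u * h)

# Toy usage: two stacked IndRNN layers unrolled over a random sequence.
layer1, layer2 = IndRNNCell(32, 128), IndRNNCell(128, 128)
x = torch.randn(50, 8, 32)                                  # (time, batch, features)
h1 = h2 = torch.zeros(8, 128)
for t in range(x.size(0)):
    h1 = layer1(x[t], h1)
    h2 = layer2(h1, h2)

Because each neuron's contribution to the gradient over T steps scales with its own u^T rather than with products of full weight matrices, bounding u per neuron is what allows the network to be unrolled over thousands of time steps without exploding or vanishing gradients.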

Code Repositories

secretlyvogon/IndRNNTF (tf)
TobiasLee/Text-Classification (tf)
trevor-richardson/rnn_zoo (pytorch)
Sunnydreamrain/IndRNN_pytorch (pytorch)
StefOe/indrnn-pytorch (pytorch)
lmnt-com/haste (tf)
Sunnydreamrain/IndRNN_Theano_Lasagne (theano, official)
Sunnydreamrain/IndRNN (tf)
batzner/indrnn (tf)

Benchmarks

Benchmark | Methodology | Metrics
language-modelling-on-penn-treebank-character | IndRNN | Bit per Character (BPC): 1.19
sequential-image-classification-on-sequential | IndRNN | Permuted Accuracy: 96%; Unpermuted Accuracy: 99%
skeleton-based-action-recognition-on-ntu-rgbd | Ind-RNN | Accuracy (CS): 81.8; Accuracy (CV): 88.0
