Gating Revisited: Deep Multi-layer RNNs That Can Be Trained

Mehmet Ozgur Turkoglu, Stefano D'Aronco, Jan Dirk Wegner, Konrad Schindler

Abstract

We propose a new STAckable Recurrent cell (STAR) for recurrent neural networks (RNNs), which has fewer parameters than the widely used LSTM and GRU while being more robust against vanishing or exploding gradients. Stacking recurrent units into deep architectures suffers from two major limitations: (i) many recurrent cells (e.g., LSTMs) are costly in terms of parameters and computational resources; and (ii) deep RNNs are prone to vanishing or exploding gradients during training. We investigate the training of multi-layer RNNs and examine the magnitude of the gradients as they propagate through the network in the "vertical" direction. We show that, depending on the structure of the basic recurrent unit, the gradients are systematically attenuated or amplified. Based on our analysis, we design a new type of gated cell that better preserves gradient magnitude. We validate our design on a large number of sequence modelling tasks and demonstrate that the proposed STAR cell allows building and training deeper recurrent architectures, ultimately leading to improved performance while being computationally more efficient.
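To make the parameter-saving idea concrete, below is a minimal NumPy sketch of a single-gate recurrent cell in the spirit of STAR: one gate blends the previous hidden state with an input-driven candidate, so the cell needs only three weight matrices (versus eight for an LSTM or six for a GRU). The weight names (Wx, Wh, Wz) and initialization are our own illustrative choices, not the authors' reference implementation; see the official repository 0zgur0/STAR_Network for the exact equations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class STARCellSketch:
    """Illustrative single-gate recurrent cell: one gate k mixes the
    previous hidden state with a candidate z computed from the input."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        self.Wx = rng.uniform(-s, s, (hidden_size, input_size))   # gate, input part
        self.Wh = rng.uniform(-s, s, (hidden_size, hidden_size))  # gate, recurrent part
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))   # candidate
        self.bk = np.ones(hidden_size)   # positive gate bias (assumed; eases gradient flow)
        self.bz = np.zeros(hidden_size)

    def step(self, x, h):
        k = sigmoid(self.Wx @ x + self.Wh @ h + self.bk)  # single gate
        z = np.tanh(self.Wz @ x + self.bz)                # candidate state
        return np.tanh((1.0 - k) * h + k * z)             # gated convex blend

    def forward(self, xs):
        h = np.zeros(self.Wh.shape[0])
        for x in xs:
            h = self.step(x, h)
        return h
```

Because the new state is a convex combination of the old state and the candidate, squashed once by tanh, the Jacobian with respect to the previous state stays close to well-conditioned, which is the property the paper's gradient analysis targets when stacking many layers.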

Code Repositories

0zgur0/STAR_Network (official, TensorFlow)
titu1994/tf_star_rnn (TensorFlow)
croros/STAR_Network_Pytorch (PyTorch)

Benchmarks

Benchmark                                      | Methodology | Metrics
action-recognition-in-videos-on-jester-1       | convSTAR    | Val: 92.7
language-modelling-on-penn-treebank-character  | STAR        | Bit per Character (BPC): 1.30
sequential-image-classification-on-sequential  | STAR        | Unpermuted Accuracy: 99.4%
