

Bi-directional Long Short-Term Memory / Bi-LSTM


Definition

Deep neural networks have demonstrated superior results in many areas such as speech recognition, image processing, and natural language processing. The LSTM, a variant of the RNN, can learn long-term dependencies in data that a plain RNN struggles to capture.
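In one common formulation, the LSTM cell uses input, forget, and output gates to control what is written to, kept in, and read from its cell state, which is what allows information to persist over long time spans:

$$
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

Here $x_t$ is the input at time $t$, $h_t$ the hidden state, $c_t$ the cell state, $\sigma$ the sigmoid function, and $\odot$ element-wise multiplication.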

In 2005, Graves proposed combining the LSTM with the BRNN to form the BLSTM. Compared with the BRNN, the BLSTM copes better with vanishing and exploding gradients. In 2013, Graves proposed the deep BLSTM, which extracts and represents features more effectively and outperforms the single-layer BLSTM.
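As a minimal sketch of the idea (assuming PyTorch; the layer sizes are illustrative and not from the article), a bidirectional LSTM runs a forward and a backward pass over the sequence and concatenates their hidden states, while stacking layers yields a deep BLSTM:

```python
import torch
import torch.nn as nn

# Illustrative sizes: 128-dimensional inputs, 64 hidden units per direction.
input_size, hidden_size = 128, 64

# bidirectional=True runs a forward and a backward LSTM over the sequence and
# concatenates their outputs; num_layers=2 stacks BLSTM layers into a deep BLSTM.
blstm = nn.LSTM(input_size, hidden_size, num_layers=2,
                batch_first=True, bidirectional=True)

x = torch.randn(8, 50, input_size)   # (batch, time, features)
outputs, (h_n, c_n) = blstm(x)
print(outputs.shape)                 # torch.Size([8, 50, 128]); 128 = 2 * hidden_size
```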

Development Analysis

Bottleneck

Unlike CNNs, sequence models such as the LSTM are difficult to parallelize and therefore hard to accelerate on GPUs. In addition, RNNs and LSTMs, with their recurrent structure and built-in memory cells, may see declining use and lose competitiveness against CNN-based solutions, because parallel architectures outperform sequential ones on modern hardware.
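The sequential bottleneck is visible in the recurrence itself: each hidden state depends on the previous one, so the time steps of an LSTM cannot be computed in parallel, whereas a convolution processes all positions at once (a hypothetical PyTorch sketch with illustrative sizes):

```python
import torch
import torch.nn as nn

cell = nn.LSTMCell(input_size=32, hidden_size=64)   # illustrative sizes
x = torch.randn(8, 50, 32)                          # (batch, time, features)
h = torch.zeros(8, 64)
c = torch.zeros(8, 64)

# Each step needs the previous hidden state, so the loop over time is inherently
# sequential and cannot be spread across time steps on a GPU.
for x_t in x.unbind(dim=1):
    h, c = cell(x_t, (h, c))

# A 1-D convolution, by contrast, applies the same kernel to every time step at once,
# which maps naturally onto GPU parallelism.
conv = nn.Conv1d(in_channels=32, out_channels=64, kernel_size=3, padding=1)
y = conv(x.transpose(1, 2))                         # all positions computed in parallel
```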

Future Development Directions

BLSTM has many development directions:

  • The input and output gates of the LSTM and its variant BLSTM will likely be replaced by auxiliary differentiable memories;
  • More complex architectures that combine several kinds of networks, such as BLSTM+CNN hybrids (see the sketch after this list).
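As a sketch of such a combination (assuming PyTorch; the class name CNNBLSTM and all layer sizes are hypothetical), a 1-D CNN can extract local features that a BLSTM then models across time:

```python
import torch
import torch.nn as nn

class CNNBLSTM(nn.Module):
    """Illustrative CNN + BLSTM hybrid: convolution for local features, BLSTM for temporal context."""
    def __init__(self, in_features=40, conv_channels=64, hidden_size=128, num_classes=10):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.blstm = nn.LSTM(conv_channels, hidden_size,
                             batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):                       # x: (batch, time, features)
        z = self.conv(x.transpose(1, 2))        # Conv1d expects (batch, channels, time)
        z, _ = self.blstm(z.transpose(1, 2))    # back to (batch, time, channels)
        return self.classifier(z)               # per-time-step class scores

model = CNNBLSTM()
scores = model(torch.randn(8, 100, 40))         # -> torch.Size([8, 100, 10])
```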
