What is a Long Short-Term Memory?

A Long Short-Term Memory (LSTM) is a recurrent neural network architecture from the field of artificial intelligence that is equipped with a kind of long-lasting short-term memory. Artificial neural networks are trained by gradient descent: an error signal is propagated backwards through the network and the weights are adjusted step by step to reduce it.

Gradient descent can be compared to a mountaineer searching for the deepest valley. With many layers, however, the descent can fail: the error signal fades as it travels backwards through the layers, like a forgetful mountaineer who settles into the first valley he reaches on the way down and can no longer find his village in a deeper valley. With the LSTM technique, however, this problem can be solved.

For a better memory, LSTM cells contain three types of gates. With an input gate, a forget gate and an output gate, an LSTM cell can retain memories of previous inputs: the short-term memory lasts for a long time, while the principal behaviour of the network is encoded in its weights. Neural networks with many layers that are equipped in this way are extremely capable learners. Long Short-Term Memory ensures that such multi-layered networks function excellently, and this technique has made a breakthrough in artificial intelligence possible.
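The three gates can be sketched in a few lines of NumPy. This is a minimal illustration of a single LSTM cell step, not any library's implementation; all names (W_i, lstm_step, and so on) are illustrative, and the weights are random rather than trained.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 3

# Randomly initialised parameters (illustrative only): one weight matrix per
# gate, each acting on the concatenated [previous hidden state, current input].
W_i, W_f, W_o, W_c = (rng.standard_normal((n_hidden, n_hidden + n_in)) for _ in range(4))
b_i = b_f = b_o = b_c = np.zeros(n_hidden)

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([h_prev, x])
    i = sigmoid(W_i @ z + b_i)        # input gate: how much new information to write
    f = sigmoid(W_f @ z + b_f)        # forget gate: how much old state to keep
    o = sigmoid(W_o @ z + b_o)        # output gate: how much state to expose
    c_tilde = np.tanh(W_c @ z + b_c)  # candidate cell content
    c = f * c_prev + i * c_tilde      # updated cell state (the "memory")
    h = o * np.tanh(c)                # new hidden state / output
    return h, c

h, c = np.zeros(n_hidden), np.zeros(n_hidden)
x = rng.standard_normal(n_in)
h, c = lstm_step(x, h, c)
print(h.shape, c.shape)  # (3,) (3,)
```

The cell state c is the long-lasting short-term memory: the forget gate decides how much of it survives each step, while the input and output gates control what is written to it and what is read from it.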

What is the purpose of LSTM networks?

LSTM networks are particularly well suited to classifying, processing and making predictions based on time-series data. They are applicable to tasks such as handwriting recognition and speech recognition, and can also be used for anomaly detection in network traffic or in intrusion detection systems.
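For time-series prediction, the cell is unrolled over the sequence and the final hidden state is mapped to an output. The sketch below illustrates this with untrained random weights, so the predicted value is meaningless; all names (predict_next, w_out, and so on) are assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_in, n_h = 1, 8
W = rng.standard_normal((4 * n_h, n_h + n_in)) * 0.1  # all four gates stacked
b = np.zeros(4 * n_h)
w_out = rng.standard_normal(n_h) * 0.1                # linear readout layer

def predict_next(series):
    """Run the LSTM over a 1-D series and read out a next-value prediction."""
    h = c = np.zeros(n_h)
    for x_t in series:
        z = W @ np.concatenate([h, [x_t]]) + b
        i, f, o = (sigmoid(z[k * n_h:(k + 1) * n_h]) for k in range(3))
        g = np.tanh(z[3 * n_h:])   # candidate cell content
        c = f * c + i * g          # gated state update
        h = o * np.tanh(c)
    return float(w_out @ h)

print(predict_next([0.1, 0.2, 0.3, 0.4]))  # an arbitrary scalar; weights are untrained
```

In practice the weights would be fitted to historical data with backpropagation through time; only then does the output become a useful forecast.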

What are the benefits of Long Short-Term Memory?

LSTM is a methodology from the field of artificial intelligence. It describes a special functional block within recurrent neural networks that provides a kind of "long short-term memory". With LSTM, these networks become considerably more powerful.

With Long Short-Term Memory, a short-term memory is created that lasts for a long time, allowing Recurrent Neural Networks (RNNs) to remember certain long-term dependencies and past experiences. As a result, not every task or problem has to be started from scratch: knowledge already acquired can be reused, and RNNs can fall back on earlier experience.

What makes Long Short-Term Memory so special?

LSTMs are an evolution of recurrent neural networks. What particularly distinguishes an LSTM is its ability to select information from the past in sequential data: it can learn from training data which past information is useful for the current output and which can simply be forgotten.
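The forget gate is what makes this selection possible. A tiny numerical illustration (values chosen by hand, not from a trained network) shows how a gate output near 1 preserves the cell state while a gate output near 0 erases it:

```python
import numpy as np

# c_prev holds a "memory"; f is the forget gate's output, always between 0 and 1.
c_prev = np.array([2.0, -1.5, 0.5])

f_keep   = np.array([0.99, 0.99, 0.99])  # gate near 1: retain the memory
f_forget = np.array([0.01, 0.01, 0.01])  # gate near 0: erase the memory

# With the input gate fully closed, the state update c = f * c_prev + i * c_tilde
# reduces to c = f * c_prev.
print(f_keep * c_prev)    # ~ [ 1.98  -1.485  0.495]: almost unchanged
print(f_forget * c_prev)  # ~ [ 0.02  -0.015  0.005]: almost erased
```

During training, the network learns gate values per element and per time step, so useful information is carried forward while irrelevant information fades away.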

Such a capability is the result of years of research and development. The networks are trained with backpropagation, and the LSTM architecture keeps the error signal from fading away over long sequences. Because of this great potential, the use of LSTM is worthwhile for many companies.