Long Short-Term Memory

Long Short-Term Memory (LSTM) is an Artificial Neural Network (ANN) architecture widely used for tasks such as speech recognition, natural language processing, and time series prediction. The architecture enables the network to learn long-term dependencies by preserving information over many time steps. LSTM networks are built from units called memory cells, each of which maintains an internal cell state. Gates attached to each cell control the flow of information into and out of that state, allowing the network to decide what to remember, what to forget, and what to output. These gates give LSTMs their advantage over traditional Recurrent Neural Networks, which struggle to retain information across long sequences, since they let the network learn from both short- and long-term dependencies.

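To make the gate mechanism concrete, here is a minimal sketch of a single LSTM time step in Python with NumPy. The function name lstm_cell_step, the parameter layout, and the toy sizes are illustrative assumptions for this glossary entry, not the API of any particular library.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x_t, h_prev, c_prev, params):
    """One LSTM time step: gates decide what to forget, write, and output."""
    W_f, W_i, W_o, W_c, b_f, b_i, b_o, b_c = params
    z = np.concatenate([h_prev, x_t])      # previous hidden state joined with the current input

    f = sigmoid(W_f @ z + b_f)             # forget gate: how much of the old cell state to keep
    i = sigmoid(W_i @ z + b_i)             # input gate: how much new information to write
    o = sigmoid(W_o @ z + b_o)             # output gate: how much of the cell state to expose
    c_tilde = np.tanh(W_c @ z + b_c)       # candidate values to write into the cell state

    c_t = f * c_prev + i * c_tilde         # updated cell state (the long-term memory)
    h_t = o * np.tanh(c_t)                 # updated hidden state (the short-term output)
    return h_t, c_t

# Illustrative usage with a hypothetical hidden size of 4 and input size of 3.
rng = np.random.default_rng(0)
n_h, n_x = 4, 3
params = [rng.standard_normal((n_h, n_h + n_x)) for _ in range(4)] + \
         [np.zeros(n_h) for _ in range(4)]
h, c = np.zeros(n_h), np.zeros(n_h)
for x_t in rng.standard_normal((5, n_x)):  # a toy sequence of 5 time steps
    h, c = lstm_cell_step(x_t, h, c, params)

Because the forget and input gates saturate near 0 or 1, the cell state can be carried forward almost unchanged across many steps, which is how the network keeps track of long-term information.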
Long Short-Term Memory on Wikipedia