
RNN Language Model

Character sequence prediction using a simple RNN (Recurrent Neural Network) or an LSTM (Long Short-Term Memory) network.
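A minimal sketch of a character-level language model is shown below, assuming PyTorch. The class name, layer sizes, and the `use_lstm` switch are illustrative assumptions, not the repository's exact implementation.

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """Character-level language model: embed -> RNN/LSTM -> vocab logits."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128, use_lstm=True):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Same interface for a plain RNN and an LSTM, so they are easy to swap.
        rnn_cls = nn.LSTM if use_lstm else nn.RNN
        self.rnn = rnn_cls(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, hidden=None):
        # x: (batch, seq_len) of character indices
        emb = self.embed(x)                  # (batch, seq_len, embed_dim)
        out, hidden = self.rnn(emb, hidden)  # (batch, seq_len, hidden_dim)
        logits = self.fc(out)                # (batch, seq_len, vocab_size)
        return logits, hidden

# Toy usage: predict the most likely next character for a vocabulary of 50 symbols.
model = CharRNN(vocab_size=50)
dummy = torch.randint(0, 50, (1, 10))        # one sequence of 10 character ids
logits, _ = model(dummy)
next_char = logits[0, -1].argmax().item()    # id of the predicted next character
```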

Why LSTM?

Standard RNNs suffer from vanishing and exploding gradients. LSTMs address these problems by maintaining a separate cell state and introducing gates (input, forget, and output gates) that control how information is written, retained, and exposed at each step. This gating gives much better control over gradient flow and preserves long-range dependencies that a plain RNN tends to lose.
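The single-step LSTM update below is a hand-written sketch of the same equations that `nn.LSTM` implements internally; the tensor names and shapes are assumptions for illustration only. It shows why the forget gate helps: the cell state is updated additively, so gradients can pass through it nearly unchanged when the forget gate stays close to 1.

```python
import torch

def lstm_step(x, h_prev, c_prev, W_x, W_h, b):
    """One LSTM time step for input x with previous hidden/cell state."""
    # A single affine transform produces all four gate pre-activations.
    gates = x @ W_x + h_prev @ W_h + b
    i, f, g, o = gates.chunk(4, dim=-1)
    i = torch.sigmoid(i)   # input gate: how much new content to write
    f = torch.sigmoid(f)   # forget gate: how much of the old cell state to keep
    g = torch.tanh(g)      # candidate cell content
    o = torch.sigmoid(o)   # output gate: how much of the cell to expose
    # Additive cell update: when f is near 1, gradients flow through c_prev
    # almost untouched, which mitigates the vanishing-gradient problem.
    c = f * c_prev + i * g
    h = o * torch.tanh(c)
    return h, c
```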