Elman RNN
A simple implementation of a recurrent neural network with one context layer, with the backpropagation calculation performed explicitly.
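The core recurrence of an Elman network (the hidden state is copied into a context layer and fed back on the next step) can be sketched in plain NumPy. This is an illustration of the architecture only, not this repo's API; the weight names and sizes here are arbitrary.

```python
import numpy as np

# Illustrative dimensions (not taken from this repo).
n_in, n_hidden, n_out = 2, 8, 2
rng = np.random.default_rng(0)

W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

def step(x, h_prev):
    # The "context layer" is simply the previous hidden state h_prev.
    h = np.tanh(W_xh @ x + W_hh @ h_prev + b_h)
    y = W_hy @ h + b_y
    return y, h

# Unroll a few steps over a toy input sequence.
h = np.zeros(n_hidden)
for t in range(5):
    x = np.array([np.sin(t), np.cos(t)])
    y, h = step(x, h)
```

Backpropagation through this unrolled loop (accumulating gradients through `W_hh` across time steps) is what the repo's explicit backpropagation code computes.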
Usage
There are two main implementations of the Elman RNN: elman.py and elman_opt.py. The former is a basic implementation of the network; the latter uses numba to JIT-compile the backpropagation calculation to machine code, giving roughly a 5x reduction in calculation time. In practice the two are fully interchangeable, and the relevant class and functions can be imported from either module as:
from <elman | elman_opt> import ElmanNetwork, normalize, load, save
For more information, see the example notebooks listed below.
Examples
Drawing a circle
Drawing a figure-eight
Drawing both a circle and figure-eight (multi-attractor)
References
Elman, Jeffrey L. “Distributed Representations, Simple Recurrent Networks, and Grammatical Structure.” Machine Learning 7, no. 2 (September 1, 1991): 195–225. https://doi.org/10.1007/BF00114844.