An implementation of a generic feed-forward ANN framework (for tutorial purposes)
This is a Jupyter notebook showing a vanilla implementation of a general ANN framework. It allows you to create arbitrary feed-forward neural networks. The implementation relies only on NumPy. It uses automatic differentiation and can be extended to support new activation and cost functions.
The goal of this implementation is not efficiency (nor numerical stability); rather, it is meant to show what happens "behind the curtains" of more advanced implementations.
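As a rough idea of what such a NumPy-only framework can look like, here is a minimal sketch: layers are stacked into a network, the forward pass caches intermediate values, and gradients are propagated backwards with the chain rule. All names (`Dense`, `Network`, `ACTIVATIONS`, `train_step`) are illustrative assumptions and not the notebook's actual API.

```python
# Minimal sketch of a feed-forward ANN framework using only NumPy.
# Names and structure are hypothetical, for illustration only.
import numpy as np

# Activation functions stored as (f, f') pairs so new ones can be plugged in.
ACTIVATIONS = {
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)),
                lambda z: (s := 1.0 / (1.0 + np.exp(-z))) * (1.0 - s)),
    "relu":    (lambda z: np.maximum(0.0, z),
                lambda z: (z > 0).astype(z.dtype)),
    "linear":  (lambda z: z,
                lambda z: np.ones_like(z)),
}

class Dense:
    """Fully connected layer computing a = f(W x + b)."""
    def __init__(self, n_in, n_out, activation="sigmoid"):
        self.W = np.random.randn(n_out, n_in) * 0.1
        self.b = np.zeros(n_out)
        self.f, self.df = ACTIVATIONS[activation]

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        self.z = self.W @ x + self.b    # pre-activation
        return self.f(self.z)

    def backward(self, grad_out, lr):
        grad_z = grad_out * self.df(self.z)   # chain rule through f
        grad_W = np.outer(grad_z, self.x)
        grad_x = self.W.T @ grad_z            # gradient for the previous layer
        self.W -= lr * grad_W                 # plain gradient-descent update
        self.b -= lr * grad_z
        return grad_x

class Network:
    """A feed-forward network as an arbitrary stack of layers."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def train_step(self, x, y, lr=0.1):
        y_hat = self.forward(x)
        grad = y_hat - y                      # gradient of 0.5 * ||y_hat - y||^2
        for layer in reversed(self.layers):
            grad = layer.backward(grad, lr)
        return 0.5 * np.sum((y_hat - y) ** 2)

# Tiny usage example: fit XOR with a 2-4-1 network.
net = Network([Dense(2, 4, "sigmoid"), Dense(4, 1, "sigmoid")])
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
for _ in range(5000):
    for x, y in zip(X, Y):
        net.train_step(x, y, lr=0.5)
print([net.forward(x).item() for x in X])
```

New activation or cost functions would be added by supplying the function together with its derivative, in the same spirit as the `ACTIVATIONS` table above.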