A small neural-network library: nodes and layers, forward and backward propagation, and stochastic gradient descent. Much of the starter code is adapted from Udacity's Deep Learning Foundations Nanodegree.
- miniflow.py - Implements the input, linear, and sigmoid activation nodes/layers, forward/backward propagation, and stochastic gradient descent.
- nn.py - Builds a network and tests that stochastic gradient descent works as expected, with gradients computed via backpropagation.
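
The pieces above fit together as: a forward pass through a linear node and a sigmoid node, a backward pass that chains gradients from the loss back to the weights, and an SGD update on those weights. A minimal sketch of that loop is below; the function names and the MSE loss are illustrative assumptions, not the actual miniflow.py API, and for brevity it uses the full batch each step rather than sampling mini-batches as true SGD would.

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, W, b):
    """Forward pass: linear node (XW + b) followed by a sigmoid node."""
    return sigmoid(X @ W + b)

def mse(y_hat, y):
    """Mean squared error loss."""
    return np.mean((y_hat - y) ** 2)

def backward(X, W, b, y):
    """Backward pass: gradients of MSE w.r.t. W and b via the chain rule."""
    out = forward(X, W, b)
    n = X.shape[0]
    d_out = 2.0 * (out - y) / n        # dL/d(out) for MSE
    d_lin = d_out * out * (1.0 - out)  # chain through sigmoid: s'(z) = s(z)(1 - s(z))
    dW = X.T @ d_lin                   # dL/dW
    db = d_lin.sum(axis=0)             # dL/db
    return dW, db

def sgd_update(params, grads, lr=0.5):
    """Gradient descent step: p <- p - lr * dL/dp (in place)."""
    for p, g in zip(params, grads):
        p -= lr * g

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(32, 3))
    # Synthetic targets generated by a known weight vector (illustrative).
    y = sigmoid(X @ np.array([[1.0], [-2.0], [0.5]]))
    W = rng.normal(scale=0.1, size=(3, 1))
    b = np.zeros(1)
    for _ in range(200):
        dW, db = backward(X, W, b, y)
        sgd_update([W, b], [dW, db])
    print(mse(forward(X, W, b), y))  # loss shrinks toward zero
```

Running the script trains the toy layer for 200 steps; the printed loss should be much smaller than at initialization, which is roughly what nn.py verifies for the real implementation.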