Theano implementation of dropout. See: http://arxiv.org/abs/1207.0580

Run with:

    ./mlp.py dropout     for dropout, or
    ./mlp.py backprop    for regular backprop with no dropout.

Use:

    ./plot_results.sh results.png

to visualize the results.

Based on code from:

- http://deeplearning.net/tutorial/mlp.html
- http://deeplearning.net/tutorial/logreg.html

Use the data here to make the units of the results comparable to Hinton's paper:

- http://www.cs.ubc.ca/~mdenil/hidden/mnist_batches.npz
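For reference, below is a minimal sketch of the core masking step described
in the paper, written against Theano's RandomStreams. The function name
dropout_layer and the surrounding example are illustrative assumptions, not
the exact code in mlp.py: each unit's activation is kept with probability
1 - p and zeroed otherwise at training time.

    import numpy as np
    import theano
    import theano.tensor as T
    from theano.tensor.shared_randomstreams import RandomStreams

    srng = RandomStreams(seed=12345)

    def dropout_layer(activations, p=0.5):
        # Hypothetical helper (not necessarily how mlp.py does it):
        # sample a binary mask where each unit survives with prob 1 - p.
        mask = srng.binomial(n=1, p=1 - p, size=activations.shape)
        # Cast the integer mask to floatX so the product keeps the
        # configured precision instead of upcasting to float64.
        return activations * T.cast(mask, theano.config.floatX)

    # Example: drop units from a tanh hidden layer.
    x = T.matrix('x')
    h = T.tanh(x)
    h_drop = dropout_layer(h, p=0.5)
    f = theano.function([x], h_drop)
    print(f(np.ones((2, 4), dtype=theano.config.floatX)))

Note that, per the paper, no mask is applied at test time; instead the
outgoing weights are scaled by 1 - p (the "mean network"). That test-time
bookkeeping is omitted from the sketch above.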