Issues
- Add more stochastic gradient descent methods (#2, opened by aromanro, 1 comment; see the momentum sketch after this list)
- Add the possibility to 'skip' layers (#17, opened by aromanro, 1 comment)
- Add ensembles example (#20, opened by aromanro, 3 comments)
- Add batch normalization (#6, opened by aromanro, 0 comments; see the batch-normalization sketch after this list)
- Refactor code (#28, opened by aromanro, 0 comments)
- Add more activation functions (#4, opened by aromanro, 0 comments)
- Add reinforcement learning (#27, opened by aromanro, 1 comment)
- Try categorical cross-entropy loss instead of the cross-entropy loss currently used (#26, opened by aromanro, 0 comments; see the loss sketch after this list)
- Add convolutions (#13, opened by aromanro, 0 comments)
- Add more loss functions (#3, opened by aromanro, 0 comments)
- Implement autodiff (#25, opened by aromanro, 2 comments)
- For the EMNIST dataset, apply some affine transformations (except reflections) for data augmentation (#16, opened by aromanro, 0 comments)
- Add softmax with 'temperature' (#24, opened by aromanro, 1 comment; see the softmax sketch after this list)
- Add distillation (#21, opened by aromanro, 0 comments)
- Add saving/loading for the neural net (#14, opened by aromanro, 0 comments)
- Add autoencoders (#22, opened by aromanro, 0 comments)
- Add recurrent neural network (#18, opened by aromanro, 2 comments)
- Add dropout (#7, opened by aromanro, 0 comments; see the dropout sketch after this list)
- Add overall accuracy (% correct) for the Iris dataset as well, for the softmax and neural network stats (#11, opened by aromanro, 0 comments)
- Add layer normalization (#12, opened by aromanro, 0 comments)
- Enhance initialisation of weights (#9, opened by aromanro, 0 comments; see the initialisation sketch after this list)
- Add a scaler (#5, opened by aromanro; see the scaler sketch after this list)
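
Below are a few illustrative C++ sketches for some of the issues above; none of them are taken from the repository's code, and all names are hypothetical. For #2 (more stochastic gradient descent methods), this is a minimal sketch of an SGD update with momentum.

```cpp
// Hypothetical sketch of SGD with momentum; the names (MomentumSGD, step,
// velocity) are illustrative and not the repository's API.
#include <vector>

class MomentumSGD {
public:
    MomentumSGD(double learningRate, double momentum)
        : lr(learningRate), mu(momentum) {}

    // Updates the weights in place given the current gradient.
    void step(std::vector<double>& weights, const std::vector<double>& grad) {
        if (velocity.size() != weights.size())
            velocity.assign(weights.size(), 0.);

        for (size_t i = 0; i < weights.size(); ++i) {
            // v <- mu * v - lr * g;  w <- w + v
            velocity[i] = mu * velocity[i] - lr * grad[i];
            weights[i] += velocity[i];
        }
    }

private:
    double lr;
    double mu;
    std::vector<double> velocity; // running velocity, lazily sized
};
```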
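For #6 (batch normalization), a sketch of the training-time forward pass for one feature across a mini-batch, assuming learned scale (gamma) and shift (beta) parameters; the function name is an assumption.

```cpp
// Illustrative batch-normalization forward pass over a mini-batch of
// activations for a single feature.
#include <cmath>
#include <vector>

std::vector<double> batchNormForward(const std::vector<double>& x,
                                     double gamma, double beta,
                                     double eps = 1e-5) {
    const double n = static_cast<double>(x.size());

    double mean = 0.;
    for (double v : x) mean += v;
    mean /= n;

    double var = 0.;
    for (double v : x) var += (v - mean) * (v - mean);
    var /= n;

    std::vector<double> out(x.size());
    for (size_t i = 0; i < x.size(); ++i)
        out[i] = gamma * (x[i] - mean) / std::sqrt(var + eps) + beta;

    return out;
}
```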
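For #26, a sketch of categorical cross-entropy with a one-hot target: only the predicted probability of the true class contributes, so the loss reduces to the negative log of that probability. The function name and the eps clamp are illustrative.

```cpp
// Illustrative categorical cross-entropy for a one-hot label.
#include <cmath>
#include <vector>

double categoricalCrossEntropy(const std::vector<double>& predictedProbs,
                               size_t trueClass, double eps = 1e-12) {
    // predictedProbs is assumed to be a softmax output (non-negative, sums to 1).
    return -std::log(predictedProbs[trueClass] + eps);
}
```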
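For #24 (and relevant to distillation, #21), a sketch of softmax with a temperature parameter T: dividing the logits by T > 1 softens the output distribution, while T < 1 sharpens it. The function name is hypothetical.

```cpp
// Illustrative softmax with temperature, with the usual max-subtraction
// trick for numerical stability.
#include <algorithm>
#include <cmath>
#include <vector>

std::vector<double> softmaxWithTemperature(const std::vector<double>& logits,
                                           double T = 1.) {
    std::vector<double> out(logits.size());
    const double maxLogit = *std::max_element(logits.begin(), logits.end());

    double sum = 0.;
    for (size_t i = 0; i < logits.size(); ++i) {
        out[i] = std::exp((logits[i] - maxLogit) / T);
        sum += out[i];
    }
    for (double& v : out) v /= sum;

    return out;
}
```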
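For #7, a sketch of inverted dropout at training time: each activation is kept with probability keepProb and rescaled by 1/keepProb, so nothing has to change at inference time. The function name is illustrative.

```cpp
// Illustrative inverted dropout applied in place to a layer's activations.
#include <random>
#include <vector>

void applyDropout(std::vector<double>& activations, double keepProb,
                  std::mt19937& rng) {
    std::bernoulli_distribution keep(keepProb);
    for (double& a : activations)
        a = keep(rng) ? a / keepProb : 0.;
}
```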
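For #9 (enhance initialisation of weights), a sketch of Xavier (Glorot) and He initialisation, where the weight variance is chosen from the layer's fan-in and fan-out; the function names are assumptions.

```cpp
// Illustrative Xavier and He weight initialisation for a fully connected layer.
#include <cmath>
#include <random>
#include <vector>

std::vector<double> xavierInit(size_t fanIn, size_t fanOut, std::mt19937& rng) {
    std::normal_distribution<double> dist(0., std::sqrt(2. / (fanIn + fanOut)));
    std::vector<double> w(fanIn * fanOut);
    for (double& v : w) v = dist(rng);
    return w;
}

std::vector<double> heInit(size_t fanIn, size_t fanOut, std::mt19937& rng) {
    // Usually preferred for ReLU-like activations.
    std::normal_distribution<double> dist(0., std::sqrt(2. / fanIn));
    std::vector<double> w(fanIn * fanOut);
    for (double& v : w) v = dist(rng);
    return w;
}
```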
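For #5, a sketch of a standard scaler in the fit/transform style: fit records per-feature mean and standard deviation, transform maps each feature to zero mean and unit variance. The class name and interface are assumptions, not the repository's API.

```cpp
// Illustrative standard scaler (z-score normalisation).
#include <cmath>
#include <vector>

class StandardScaler {
public:
    // data[i] is one sample; all samples must have the same number of features.
    void fit(const std::vector<std::vector<double>>& data) {
        const size_t features = data.empty() ? 0 : data[0].size();
        means.assign(features, 0.);
        stds.assign(features, 0.);

        for (const auto& sample : data)
            for (size_t j = 0; j < features; ++j) means[j] += sample[j];
        for (double& m : means) m /= data.size();

        for (const auto& sample : data)
            for (size_t j = 0; j < features; ++j)
                stds[j] += (sample[j] - means[j]) * (sample[j] - means[j]);
        for (double& s : stds) s = std::sqrt(s / data.size());
    }

    void transform(std::vector<double>& sample) const {
        for (size_t j = 0; j < sample.size(); ++j)
            sample[j] = (sample[j] - means[j]) / (stds[j] > 0. ? stds[j] : 1.);
    }

private:
    std::vector<double> means, stds;
};
```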