Issues
- Get rid of negative activations with ReLU GRUs (#11, opened by gchrupala)
- Implement Net2DeeperNet for GRU layers (#10, opened by gchrupala)
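For background on issue #10: Net2DeeperNet (from the Net2Net paper) deepens a network by inserting a new layer initialized to compute the identity, so the deepened network initially produces the same outputs. The sketch below shows the idea for a plain ReLU layer; extending the identity-preserving initialization to GRU layers is exactly what the issue asks for, and this code makes no assumption about how the repo implements it.

```python
import numpy as np

def net2deeper_identity(width):
    # Net2DeeperNet: initialize the inserted layer to the identity map.
    # With ReLU this is exact, since relu(I @ x + 0) == x for x >= 0,
    # and ReLU activations feeding the new layer are non-negative.
    W = np.eye(width)
    b = np.zeros(width)
    return W, b

# Inserting the new layer leaves the network's function unchanged:
W, b = net2deeper_identity(4)
x = np.array([0.5, 1.2, 0.0, 3.0])  # non-negative ReLU activations
y = np.maximum(W @ x + b, 0.0)
# y equals x
```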
- Implement Xavier initialization (#5, opened by kadarakos)
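For reference on issue #5: Xavier (Glorot) uniform initialization draws weights from a range scaled by the layer's fan-in and fan-out, keeping activation variance roughly constant across layers. A minimal NumPy sketch, independent of the repo's own layer code (the function name and RNG handling are illustrative):

```python
import numpy as np

def xavier_init(n_in, n_out, rng=None):
    # Glorot & Bengio (2010): sample uniformly from
    # [-sqrt(6 / (n_in + n_out)), +sqrt(6 / (n_in + n_out))]
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

W = xavier_init(256, 128)
```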
- Dropout should rescale values at train time (#7, opened by gchrupala)
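Issue #7 describes "inverted" dropout: scale the surviving units by 1/(1-p) at train time, so no rescaling is needed at test time and the expected activation is unchanged. A minimal sketch of the technique (not the repo's actual implementation):

```python
import numpy as np

def dropout(x, p_drop, rng, train=True):
    # Inverted dropout: zero units with probability p_drop and scale
    # survivors by 1 / (1 - p_drop), so E[output] == input and the
    # test-time forward pass needs no correction.
    if not train or p_drop == 0.0:
        return x
    keep = 1.0 - p_drop
    mask = rng.random(x.shape) < keep
    return x * mask / keep
```

At test time the function is the identity, which is the point of rescaling during training rather than at inference.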
- Fix the autoencoder example (#4, opened by gchrupala)
- Referring by name (#3, opened by kadarakos)