iassael/torch-bnlstm
Batch-Normalized LSTM (Recurrent Batch Normalization) implementation in Torch.
Lua
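For orientation, here is a minimal sketch of a batch-normalized LSTM cell built with `nn` and `nngraph`, loosely following Cooijmans et al.'s Recurrent Batch Normalization: batch normalization is applied separately to the input-to-hidden and hidden-to-hidden projections, and once more to the cell state before the output gate. The function name `bnlstmCell` and the use of plain `nn.Linear` are illustrative assumptions, not the repository's exact modules (the repository defines a `LinearNB` module, see issue #2, presumably a bias-free linear layer).

```lua
-- Sketch of a batch-normalized LSTM cell (not the repository's exact code).
require 'nn'
require 'nngraph'

local function bnlstmCell(inputSize, hiddenSize)
  local x      = nn.Identity()()   -- x_t
  local prev_h = nn.Identity()()   -- h_{t-1}
  local prev_c = nn.Identity()()   -- c_{t-1}

  -- Separate batch normalization over the input-to-hidden and
  -- hidden-to-hidden projections for all four gates.
  local i2h = nn.BatchNormalization(4 * hiddenSize)(
                nn.Linear(inputSize, 4 * hiddenSize)(x))
  local h2h = nn.BatchNormalization(4 * hiddenSize)(
                nn.Linear(hiddenSize, 4 * hiddenSize)(prev_h))
  local gates = nn.CAddTable()({i2h, h2h})

  -- Split the joint projection into the four gate pre-activations.
  local reshaped = nn.Reshape(4, hiddenSize)(gates)
  local n1, n2, n3, n4 = nn.SplitTable(2)(reshaped):split(4)

  local in_gate      = nn.Sigmoid()(n1)
  local forget_gate  = nn.Sigmoid()(n2)
  local out_gate     = nn.Sigmoid()(n3)
  local in_transform = nn.Tanh()(n4)

  -- c_t = f * c_{t-1} + i * g
  local next_c = nn.CAddTable()({
    nn.CMulTable()({forget_gate, prev_c}),
    nn.CMulTable()({in_gate, in_transform}),
  })

  -- h_t = o * tanh(BN(c_t)), with a third batch normalization (its own
  -- statistics) over the cell state. The paper also initializes the BN
  -- scale (gamma) to 0.1, which this sketch omits.
  local next_h = nn.CMulTable()({
    out_gate,
    nn.Tanh()(nn.BatchNormalization(hiddenSize)(next_c)),
  })

  return nn.gModule({x, prev_h, prev_c}, {next_h, next_c})
end
```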
Issues
- Variance Epsilon Not Reported (#4, opened by NickShahML, 0 comments)
- Comparison with Bayesian LSTMs (#6, opened by jnhwkim, 0 comments)
- Have you tried LSTM with BN but with dropout set to 0.5? Can you provide such a comparison as well? (#7, opened by eriche2016, 3 comments)
- Gamma initialization? (#5, opened by cooijmanstim, 1 comment)
- Integrating bnlstm into rnn (#3, opened by yobibyte, 2 comments)
- Just one quick question on LinearNB (#2, opened by hohoCode, 1 comment)
- `in_transform` shouldn't have batchnorm (#1, opened by abhshkdz)