jiasenlu/HieCoAttenVQA

Clarify use of num_layers as n in LSTM implementation

phisad opened this issue · 0 comments

Hello,

I am trying to re-implement your paper in Keras, and right now I'm struggling with your LSTM implementation.

You pass num_layers as n when initializing the LSTM, and num_layers should be the depth of the LSTM (the number of stacked layers). However, inside the LSTM implementation n seems to be used as the number of timesteps, indexed by L. Is that correct?

self.core = LSTM.lstm(self.rnn_size, self.rnn_size, self.num_layers, dropout)

for L = 1,n do
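
For context, here is how I am currently mapping this to Keras under the assumption that num_layers really is the depth. This is only a minimal sketch; the sizes and variable names are my own placeholders, not taken from your code:

```python
from tensorflow.keras import Input, Model, layers

rnn_size, num_layers, seq_length = 512, 2, 26  # placeholder sizes

x = Input(shape=(seq_length, rnn_size))
h = x
for L in range(num_layers):  # depth: one stacked LSTM per iteration
    # return_sequences=True so the next stacked layer sees all timesteps
    h = layers.LSTM(rnn_size, return_sequences=True)(h)
model = Model(x, h)
```

If n in your loop is actually the number of timesteps instead, this mapping would be wrong, which is why I am asking.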

Furthermore, there is createClones, which appears to create a separate set of weights for each timestep. Is this intended, given that an LSTM should share the same weights across time, or is it a bug?

for t=1,self.seq_length do
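
In Keras, by contrast, a single LSTM layer holds one weight set that is reused at every timestep, which is what I would expect here as well. A quick sketch of how I verify that on my side (the shapes below are placeholders of my own choosing):

```python
from tensorflow.keras import layers

lstm = layers.LSTM(4)
lstm.build(input_shape=(None, 26, 8))  # 26 timesteps, 8 input features
# exactly one kernel, recurrent_kernel and bias, independent of sequence
# length -- the same weights are applied at every timestep
print([(w.name, w.shape) for w in lstm.weights])
```

So I would like to confirm whether the per-timestep clones share their parameters, or whether each clone really owns its own weights.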