IndicoDataSolutions/Passage

Stacking recurrent layers

kjancsi opened this issue · 2 comments

I get a dimension mismatch error when trying to stack recurrent layers (LSTM or GatedRecurrent) to create a deep recurrent network. Multiple dense layers seem to work fine, though.

All but the last recurrent layer should have the seq_output argument set to True, so that each intermediate recurrent layer passes its full sequence of hidden states to the next recurrent layer instead of only its final state.

Example layer config (imports shown for completeness):

from passage.layers import Embedding, GatedRecurrent, Dense

layers = [
    Embedding(size=128, n_features=1000),
    GatedRecurrent(size=256, seq_output=True),   # emits full sequence for the next recurrent layer
    GatedRecurrent(size=256, seq_output=True),   # emits full sequence for the next recurrent layer
    GatedRecurrent(size=256, seq_output=False),  # emits only the final state for the Dense layer
    Dense(size=1, activation='sigmoid')
]
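
For context, here is a minimal sketch of how a stacked config like this plugs into a model, following the RNN / BinaryCrossEntropy pattern from the Passage README; train_tokens and train_labels are placeholders for your own preprocessed data:

from passage.models import RNN

# 'layers' is the stacked layer list defined above
model = RNN(layers=layers, cost='BinaryCrossEntropy')
model.fit(train_tokens, train_labels)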

That makes sense. Thanks a lot for the clarification.