Bidirectional RNN Argument Issue
Hi,
I am getting the following error while using a bidirectional RNN with a 1D CNN.
Error:
File "build/bdist.linux-x86_64/egg/keras/layers/containers.py", line 68, in add
File "/home/user1/keras-master/seya/seya/layers/recurrent.py", line 55, in set_previous
self.forward.set_previous(layer, connection_map)
TypeError: set_previous() takes exactly 2 arguments (3 given)
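For context, the traceback shows seya's wrapper forwarding an extra argument that the installed Keras's set_previous does not accept. A simplified sketch of the mismatch (abbreviated for illustration, not the exact library source):

# keras Layer base class in the installed version (simplified):
# set_previous accepts only the incoming layer
class Layer(object):
    def set_previous(self, layer):            # exactly 2 arguments incl. self
        self.previous = layer

# seya's Bidirectional wrapper (simplified): it forwards an extra
# connection_map argument, which triggers the TypeError above
class Bidirectional(Layer):
    def __init__(self, forward, backward):
        self.forward = forward
        self.backward = backward

    def set_previous(self, layer, connection_map={}):
        self.forward.set_previous(layer, connection_map)   # 3 args given, 2 expected
        self.backward.set_previous(layer, connection_map)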
My model is:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation, Flatten
from keras.layers.convolutional import Convolution1D, MaxPooling1D
from keras.layers.recurrent import LSTM
from seya.layers.recurrent import Bidirectional

forward_lstm = LSTM(input_dim=32, output_dim=32, return_sequences=True)
backward_lstm = LSTM(input_dim=32, output_dim=32, return_sequences=True)
brnn = Bidirectional(forward=forward_lstm, backward=backward_lstm, return_sequences=True)

model = Sequential()
model.add(Convolution1D(input_dim=10,
                        input_length=100,
                        nb_filter=32,
                        filter_length=7,
                        border_mode="valid",
                        activation="relu",
                        subsample_length=1))
model.add(MaxPooling1D(pool_length=3, stride=3))
model.add(Dropout(0.2))
model.add(brnn)
model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(output_dim=128))
model.add(Activation('relu'))
model.add(Dense(output_dim=1))
model.add(Activation('sigmoid'))
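For what it's worth, the layer sizes here look mutually consistent: with border_mode="valid" the conv output length is 100 - 7 + 1 = 94, and pool_length=3 with stride=3 reduces that to floor((94 - 3) / 3) + 1 = 31, so brnn receives sequences of shape (31, 32), matching its input_dim=32. The error therefore appears to come from the set_previous API change rather than from the model configuration.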
Hi, thanks for the issue @MdAsifKhan!
Keras had a few API changes and set_previous was probably modified.
I'll look at that soon. But if you are able to fix the problem, please make us a PR and I'd love to review your contribution.
Best,
-eder
The function set_previous was modified in Keras to def set_previous(self, layer, reset_weights=True).
I just deleted the third argument (the connection_map that seya passes in) and it works.
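If it helps, here is a minimal sketch of that change in seya/seya/layers/recurrent.py (the traceback only shows the forward call; the backward call is assumed to be symmetric):

# Before: the wrapper passed connection_map through, which the
# installed Keras's set_previous no longer accepts
#     self.forward.set_previous(layer, connection_map)
# After: drop the extra argument; reset_weights keeps its default
def set_previous(self, layer):
    self.forward.set_previous(layer)
    self.backward.set_previous(layer)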
@zzukun Cool! Would you make a PR?