Dilations and nb_stack relationship
rhaghi opened this issue · 4 comments
rhaghi commented
Hi,
First, thanks a lot for making this package; it has been very useful to me. I am trying to build an NN model with the code below, but I hit an error I don't understand. I want dilations to be a single value, e.g. dilations = (2,). The code is:
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model
from tcn import TCN

time_steps = 1200  # inferred from the input shape in the traceback below
input_dim = 9
i = Input(batch_shape=(None, time_steps, input_dim))
d1 = (1,)
d2 = (2,)
d3 = (4,)
nbf = 2
ks1 = 128
ks2 = 64
ks3 = 32
nbs = 1
tcn1 = TCN(nb_filters=nbf,
           kernel_size=ks1,
           dilations=d1,
           return_sequences=True,
           nb_stacks=nbs,
           padding='same')(i)
tcn2 = TCN(nb_filters=nbf,
           kernel_size=ks2,
           dilations=d2,
           return_sequences=True,
           nb_stacks=nbs,
           padding='same')(tcn1)
tcn3 = TCN(nb_filters=nbf,
           kernel_size=ks3,
           dilations=d3,
           return_sequences=True,
           nb_stacks=nbs,
           padding='same')(tcn2)
o = Dense(1, activation='linear')(tcn3)
m = Model(inputs=[i], outputs=[o])
m.compile(optimizer='adam', loss='mse')
history = m.fit(x, y, batch_size=12, epochs=300, validation_split=0.2, shuffle=True, callbacks=None)
And then the error is:
ValueError: Exception encountered when calling layer "tcn_56" (type TCN).
in user code:
File "/usr/local/lib/python3.7/dist-packages/tcn/tcn.py", line 341, in call *
x = layers.add(self.skip_connections, name='Add_Skip_Connections')
File "/usr/local/lib/python3.7/dist-packages/keras/layers/merge.py", line 791, in add **
return Add(**kwargs)(inputs)
File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/usr/local/lib/python3.7/dist-packages/keras/layers/merge.py", line 92, in build
raise ValueError('A merge layer should be called '
ValueError: A merge layer should be called on a list of at least 2 inputs. Got 1 inputs. Full input_shape received: ListWrapper([(None, 1200, 2)])
Call arguments received:
• inputs=tf.Tensor(shape=(None, 1200, 9), dtype=float32)
• training=False
• kwargs=<class 'inspect._empty'>
If I change nb_stacks = 2, the code runs without any problem (the loss is high, but it runs!). What am I overlooking?
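For context, the TCN builds one skip connection per residual block, and the number of residual blocks is nb_stacks × len(dilations). With nb_stacks=1 and a single dilation that is exactly one skip connection, while Keras's Add merge layer requires at least two inputs. Below is a minimal pure-Python sketch of that precondition (no TensorFlow needed; merge_skips is a hypothetical illustration, not the library's API):

```python
def merge_skips(skip_connections):
    # Mimics the precondition of keras.layers.add: a merge layer
    # must be called on a list of at least 2 inputs.
    if len(skip_connections) < 2:
        raise ValueError(
            'A merge layer should be called on a list of at least 2 inputs. '
            f'Got {len(skip_connections)} inputs.')
    # Element-wise sum across the skip connections.
    return [sum(vals) for vals in zip(*skip_connections)]

# nb_stacks=1 with dilations=(2,) yields a single residual block,
# hence a single skip connection, so the merge fails:
one_skip = [[1, 2, 3]]            # 1 skip connection -> ValueError
two_skips = [[1, 2], [3, 4]]      # nb_stacks=2 -> 2 skip connections -> OK
```

This mirrors why bumping nb_stacks to 2 makes the model build: the merge then receives two inputs.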
philipperemy commented
@rhaghi I think it's a bug. TCN does not seem to work with a dilation of just one value. There's a merge layer somewhere that is problematic. I'll see how to fix that.
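One way such a fix could look (an assumption about the shape of the patch, not the library's actual code): guard the merge so that a single skip connection is passed through unchanged, and only sum when there are two or more. A pure-Python sketch:

```python
def combine_skips(skip_connections):
    # Guard sketch: with a single residual block
    # (nb_stacks * len(dilations) == 1) there is nothing to merge,
    # so return the lone skip connection as-is.
    if len(skip_connections) == 1:
        return skip_connections[0]
    # Otherwise, element-wise sum over the list,
    # like keras.layers.add would do.
    return [sum(vals) for vals in zip(*skip_connections)]
```

With this guard, both dilations=(2,) with nb_stacks=1 and the multi-block configurations would take a valid path.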
philipperemy commented
For the loss, I guess it's normal that it's high at the beginning if you stack layers.
philipperemy commented
That one should fix it: #238.
philipperemy commented
Pushed in 3.5.0.