ValueError when fitting a model with get_melspectrogram_layer as the first layer.
Mxgra opened this issue · 3 comments
Hello,
first of all, really nice work and a good idea.
Following the one-shot example, I tried to add a kapre layer as the first layer of my model for keyword recognition (TensorFlow dataset). The call to model.fit() raises the following error:
ValueError: Creating variables on a non-first call to a function decorated with tf.function.
The model:
from tensorflow import keras
from tensorflow.keras.layers import (Conv2D, BatchNormalization, ReLU,
                                     GlobalAveragePooling2D, Dense, Softmax)
from kapre.composed import get_melspectrogram_layer

input_shape = (16000, 1)
model = keras.Sequential()
# kapre mel-spectrogram front-end as the first layer
melgram_layer = get_melspectrogram_layer(input_shape=input_shape, n_fft=2048, win_length=2018, hop_length=1024,
                                         input_data_format='channels_last', output_data_format='channels_last',
                                         sample_rate=16000, name='melspectro_layer')
model.add(melgram_layer)
model.add(Conv2D(32, (3, 3), strides=(2, 2)))
model.add(BatchNormalization())
model.add(ReLU())
model.add(GlobalAveragePooling2D())
model.add(Dense(num_labels))  # num_labels: number of keyword classes, defined elsewhere
model.add(Softmax())
model.compile('adam', 'categorical_crossentropy')
model.build()
Attached is an image of the error message.
It's probably just a stupid mistake on my part, but I'm stumped; can you help me?
Hi, I just tried this and didn't have the error.
>>> model = keras.Sequential()
>>> melgram_layer = get_melspectrogram_layer(input_shape=input_shape, n_fft=2048, win_length=2018, hop_length=1024,
...                                          input_data_format='channels_last', output_data_format='channels_last',
...                                          sample_rate=16000, name='melspectro_layer')
>>> model.add(melgram_layer)
>>>
>>> model.compile('adam', 'mse')
>>> model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
melspectro_layer (Sequential (None, 14, 128, 1) 0
=================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
_________________________________________________________________
>>> import numpy as np
>>> x = np.zeros((1, 16000, 1))
>>> y = np.zeros((1, 14, 128, 1))
>>> model.fit(x, y)
1/1 [==============================] - 0s 472us/step - loss: 0.0000e+00
<tensorflow.python.keras.callbacks.History object at 0x7fcd148525d0>
I also tried with model.compile(run_eagerly=False)
in case it's related, but it still works. Could you try some other model / the melspectrogram layer only / turning eager mode on and off to make sure? If it still has any problem with the most recent kapre, please share fully reproducing code with me!
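For illustration, a minimal sketch of that kind of check could look like the snippet below: fit the melspectrogram layer alone on dummy data and flip run_eagerly between True and False. The data shapes are taken from the summary above; everything else here is an assumption, not code from this thread.

# Sketch: fit the mel-spectrogram layer alone and toggle eager execution.
import numpy as np
from tensorflow import keras
from kapre.composed import get_melspectrogram_layer

probe = keras.Sequential()
probe.add(get_melspectrogram_layer(input_shape=(16000, 1), n_fft=2048,
                                   win_length=2018, hop_length=1024,
                                   sample_rate=16000))
probe.compile('adam', 'mse', run_eagerly=True)  # flip to False to compare
x = np.zeros((4, 16000, 1), dtype='float32')
y = np.zeros((4, 14, 128, 1), dtype='float32')  # output shape from model.summary()
probe.fit(x, y, epochs=1)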
You're right, with the mse loss function the model trains without problems... Do you have any idea why that is? I thought mse is primarily a regression loss, while I'm doing classification.
Alright, as you suggested, it also works with run_eagerly=False and categorical_crossentropy loss, nice 👍
Thank you for your quick answer and help! Really appreciate it.
so with the exact same model but with the categorical loss, you're having a problem?
again, i'd like to have fully reproducible code :)
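For anyone looking for a starting point, a minimal self-contained repro sketch could look like the following, using synthetic data and one-hot labels for categorical_crossentropy. The num_labels value and the data shapes are assumptions, not the original poster's values.

# Hypothetical minimal repro sketch: synthetic data, one-hot labels.
import numpy as np
from tensorflow import keras
from tensorflow.keras.layers import (Conv2D, BatchNormalization, ReLU,
                                     GlobalAveragePooling2D, Dense, Softmax)
from kapre.composed import get_melspectrogram_layer

num_labels = 8  # assumption; replace with your own label count

model = keras.Sequential()
model.add(get_melspectrogram_layer(input_shape=(16000, 1), n_fft=2048,
                                   win_length=2018, hop_length=1024,
                                   sample_rate=16000))
model.add(Conv2D(32, (3, 3), strides=(2, 2)))
model.add(BatchNormalization())
model.add(ReLU())
model.add(GlobalAveragePooling2D())
model.add(Dense(num_labels))
model.add(Softmax())
model.compile('adam', 'categorical_crossentropy')

x = np.zeros((4, 16000, 1), dtype='float32')
y = keras.utils.to_categorical(np.zeros(4, dtype=int), num_labels)  # one-hot targets
model.fit(x, y, epochs=1)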