Loss value change to NaN
Closed this issue · 3 comments
Hello sounakdey,
I tried to reproduce your results from the paper, and every time, at the end of the first epoch of training, the loss value from the model changes to NaN. I think there is some configuration in Keras that needs to be changed, so I tried setting the value of epsilon, which is 1e-07 by default, to 1e-05 to avoid this problem. Did you run into something like this when you trained your model?
Maybe it is my Keras or TensorFlow version? Which versions did you use?
Thank you.
No, I did not change the epsilon value. The code provided is running properly; I checked it. I used Keras 2.0.8 and TensorFlow 1.3.0. Even with my previous Keras 1 it ran properly.
Hello again sounakdey,
firstly, thanks for answering my question, but the code continued to crash.
I see that you use the Siamese network example from Keras.
So I found a solution. I saw that the Siamese network example from Keras had a piece of code that had to be changed.
From this:
def euclidean_distance(vects):
    x, y = vects
    return K.sqrt(K.sum(K.square(x - y), axis=1, keepdims=True))
to this:
def euclidean_distance(vects):
    x, y = vects
    return K.sqrt(K.maximum(K.sum(K.square(x - y), axis=1, keepdims=True), K.epsilon()))
Now the code is running well.
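For anyone curious why the unclamped version produces NaN: this is a minimal NumPy sketch (not the repo's code) of what happens in backprop. The derivative of sqrt(s) is 1 / (2 * sqrt(s)), which is infinite when the squared distance s is exactly 0 (e.g. for an identical pair), and inf times 0 in the chain rule yields NaN. Clamping s at epsilon, as K.maximum(..., K.epsilon()) does, keeps the gradient finite.

```python
import numpy as np

def squared_distance(x, y):
    # sum of squared differences per row, like K.sum(K.square(x - y), axis=1)
    return np.sum(np.square(x - y), axis=1, keepdims=True)

x = np.zeros((1, 4))
y = np.zeros((1, 4))            # identical pair: squared distance is exactly 0
s = squared_distance(x, y)      # array([[0.]])

# d/ds sqrt(s) = 1 / (2 * sqrt(s)); at s = 0 this is infinite,
# and inf * 0 during backprop produces NaN, poisoning the loss.
with np.errstate(divide="ignore", invalid="ignore"):
    grad_unsafe = 1.0 / (2.0 * np.sqrt(s))   # inf
    backprop_unsafe = grad_unsafe * 0.0      # inf * 0 -> nan

# Clamping the squared distance at epsilon before the sqrt,
# mirroring K.maximum(..., K.epsilon()), keeps the gradient finite.
epsilon = 1e-7
grad_safe = 1.0 / (2.0 * np.sqrt(np.maximum(s, epsilon)))

print(np.isnan(backprop_unsafe).all())   # True
print(np.isfinite(grad_safe).all())      # True
```

This also explains why the NaN only appears partway through training: it takes a pair whose embeddings collapse to (nearly) the same point for s to hit zero.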
Ahh, I got it. It seems it needs an epsilon to stop the squared distance from falling below a certain value. Great to hear that the code is running perfectly.