amansrivastava17/lstm-siamese-text-similarity

Why did the training stop early?

hyybuaa opened this issue · 3 comments

Epoch 8/200

64/450 [===>..........................] - ETA: 0s - loss: 0.8007 - acc: 0.5781
192/450 [===========>..................] - ETA: 0s - loss: 0.7411 - acc: 0.5781
320/450 [====================>.........] - ETA: 0s - loss: 0.7564 - acc: 0.5656
448/450 [============================>.] - ETA: 0s - loss: 0.7578 - acc: 0.5491
450/450 [==============================] - 0s 813us/step - loss: 0.7569 - acc: 0.5489 - val_loss: 0.8013 - val_acc: 0.4490

it stopped!

@hyybuaa We added an early-stopping condition to the model definition, so if accuracy does not improve for more than 3 epochs, training stops. This generally saves time in cases where the model has stopped learning, i.e. the weights are no longer changing meaningfully after a few epochs.
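The patience-based rule described above can be sketched in plain Python (this is an illustrative function, not the repository's actual code, which uses Keras's `EarlyStopping` callback):

```python
# Minimal sketch of patience-based early stopping: stop once the metric
# has not improved over its previous best for `patience` consecutive epochs.
def should_stop(history, patience=3):
    """history: per-epoch metric values (higher is better)."""
    if len(history) <= patience:
        return False
    best_before = max(history[:-patience])
    # Stop if none of the last `patience` epochs beat the earlier best.
    return all(value <= best_before for value in history[-patience:])

# Example: accuracy plateaus at 0.55, so the next 3 epochs trigger a stop.
accs = [0.40, 0.48, 0.52, 0.55, 0.55, 0.54, 0.55]
print(should_stop(accs))  # -> True
```

Keras's real `EarlyStopping` additionally supports options such as `min_delta` and `monitor`, but the stopping logic follows this same pattern.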

Hi,
What if you just want it to keep training anyway?

Remove the early-stopping callback from the callbacks list passed to the model's training call.