Nan loss occurs while training
lyp0413 opened this issue · 2 comments
lyp0413 commented
There is a small bug in src/losses/losses.py at lines 52 & 53, which may cause a NaN loss during training.
I think an epsilon like 1e-10 should be added to avoid log(0).
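A minimal sketch of the kind of fix being suggested (the `safe_log` helper and `eps` value are hypothetical, not names from the repo):

```python
import math

def safe_log(x, eps=1e-10):
    # Offset the argument away from zero before taking the log, so a
    # probability that underflows to exactly 0.0 does not produce -inf
    # (which then propagates to NaN through the rest of the loss).
    return math.log(x + eps)

# math.log(0.0) raises ValueError in pure Python; in NumPy/PyTorch,
# log(0) instead yields -inf, which can turn the loss into NaN.
print(safe_log(0.0))  # finite value ≈ -23.03
```

An alternative with the same effect is clamping the input, e.g. `x.clamp(min=eps)` in PyTorch, which avoids shifting values that are already well away from zero.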
maudzung commented
At what epoch did you encounter the NaN loss? I trained the model without any NaN loss. Please make sure that you are using the latest code in the repo.