RenMin1991/Dyamic-Graph-Representation

The loss turned NaN

vanessawei726 opened this issue · 0 comments

Thank you for your excellent work!

I was training on my own dataset and the loss suddenly turned NaN at around step 5000. A few days ago I ran the same training on the same dataset and it completed without issue. The only thing I have changed since then is the total number of steps in config_train_singlescale.py.

I suspect something is wrong with the loss function. I tried to track it down but couldn't find the cause.
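In case it helps, one common way a loss suddenly turns NaN mid-training is a `log` (or division) hitting zero once some probability underflows. This is just a guess about this repo's loss, but here is a minimal sketch of the usual epsilon-clamping check (the function names here are illustrative, not from this codebase):

```python
import math

def naive_log_loss(p):
    # Without safeguards, p == 0.0 raises a domain error (or yields -inf
    # in float tensors), and downstream arithmetic turns the loss into NaN.
    return -math.log(p)

def safe_log_loss(p, eps=1e-8):
    # Clamping the probability away from zero keeps the loss finite
    # even when p underflows to exactly 0.0.
    return -math.log(max(p, eps))

print(safe_log_loss(0.0))   # finite, roughly -log(1e-8)
print(safe_log_loss(0.5))   # same as the naive loss for normal inputs
```

If the loss in this repo takes a log or divides by a norm somewhere, adding a small epsilon there (or printing the loss inputs at the step where it first becomes NaN) might narrow it down.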