Rose-STL-Lab/DeepSTPP

Negative loss in training and validation

WSpie opened this issue · 0 comments

WSpie commented

I'm training on my own custom data and find that the total training loss and validation loss can become negative as the number of iterations increases. I also noticed negative loss values in the provided notebook. Does this matter? Could you explain a bit? Thank you so much!
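
For context, here is a minimal sketch (my own illustration, not DeepSTPP code) of how a negative-log-likelihood style loss over a continuous density can legitimately drop below zero: when the fitted density at the observed events exceeds 1, the log-likelihood is positive, so its negative is negative.

```python
import torch
from torch.distributions import Normal

# Hypothetical example, not DeepSTPP code: a continuous density can exceed 1,
# so the log-likelihood can be positive and the NLL loss negative.
events = torch.tensor([0.01, -0.02, 0.03])    # "observed" event coordinates
model_density = Normal(loc=0.0, scale=0.05)   # a sharply peaked fitted density

log_lik = model_density.log_prob(events)      # log-density > 0 near the mode
nll_loss = -log_lik.mean()                    # average negative log-likelihood

print(log_lik)    # positive values (density > 1)
print(nll_loss)   # negative loss, even though the fit is good
```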