Meaningless loss value in TensorBoard
Closed this issue · 1 comment
iqddd commented
Why does the loss value in TensorBoard seem to carry no meaning, whether taken as the current step loss, a running average, or an epoch average? It settles at some value very early in training (within the first epoch) and then only fluctuates within a small range for the rest of the run.
Shouldn't the loss be measured at the end of each epoch on validation images?
kohya-ss commented
The averaged loss is a moving average of the per-step losses over the last epoch's worth of steps up to that point. That is why it fluctuates strongly at first, while the window is still filling, and then the fluctuations become smaller.
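For illustration, here is a minimal sketch of that kind of windowed moving average. This is not the actual sd-scripts implementation; the class, method, and logging names are hypothetical.

```python
from collections import deque


class MovingAverageLoss:
    """Tracks a moving average of per-step losses over a fixed window
    (here: the number of steps in one epoch)."""

    def __init__(self, steps_per_epoch: int):
        # Oldest losses drop out automatically once the window is full.
        self.window = deque(maxlen=steps_per_epoch)

    def update(self, step_loss: float) -> float:
        self.window.append(step_loss)
        # Early in training the window holds only a few values, so the
        # average swings widely; once it is full, each new step changes
        # the average only slightly and the curve flattens out.
        return sum(self.window) / len(self.window)


# Hypothetical usage: log the averaged loss to TensorBoard each step.
# tracker = MovingAverageLoss(steps_per_epoch)
# writer.add_scalar("loss/average", tracker.update(loss.item()), global_step)
```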
We are planning to implement a validation loss, so please wait a little longer.