gaozhihan/PreDiff

The loss becomes negative when training PreDiff

bugsuse opened this issue · 2 comments

I want to apply PreDiff to my own dataset for precipitation nowcasting. I have already trained a Variational Autoencoder (VAE), and everything is normal at that stage. However, during the training of PreDiff, the loss became negative. In the PreDiff configuration file, apart from changing the data path and the sample dimensions (input length, target length, height, and width), all other parameters remained unchanged. Upon checking the logs, I found that the logvar term of the loss was negative. Do you have any suggestions? Any help would be appreciated!

(Screenshots of the training logs showing the negative loss and logvar values.)

Thank you for raising this question. The logvar parameter is directly adopted from the original Stable Diffusion implementation without modification. This phenomenon is also observed there:

CompVis/stable-diffusion#380

In our trials, it is common for both the loss and logvar to become negative during training, since no constraints are placed on their range.
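For reference, here is a minimal standalone sketch of the per-timestep learnable log-variance weighting used in the latent-diffusion training loss (paraphrased; the variable names and shapes below are illustrative, not PreDiff's exact code). It shows why the reported loss can dip below zero when logvar drifts negative:

```python
import torch
import torch.nn as nn

num_timesteps = 1000
logvar_init = 0.0

# One log-variance value per diffusion timestep; when learn_logvar is enabled
# it is a trainable parameter with no lower bound.
logvar = nn.Parameter(torch.full((num_timesteps,), logvar_init), requires_grad=True)

def weighted_loss(model_output, target, t):
    # Plain per-sample MSE between the model's noise prediction and the target.
    loss_simple = ((model_output - target) ** 2).mean(dim=(1, 2, 3))
    logvar_t = logvar[t]
    # SD-style reweighting: divide by exp(logvar_t) and add logvar_t.
    # Because logvar_t is unconstrained, it can become negative during training,
    # which can pull the total loss below zero even though loss_simple >= 0.
    return (loss_simple / torch.exp(logvar_t) + logvar_t).mean()

# Toy usage: a batch of 4 "latents" at random timesteps.
pred = torch.randn(4, 4, 16, 16)
noise = torch.randn(4, 4, 16, 16)
t = torch.randint(0, num_timesteps, (4,))
print(weighted_loss(pred, noise, t))
```

So a negative logged value does not by itself indicate that training has diverged; the underlying loss_simple term is still non-negative.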

Got it. Thanks a lot!