loss_diff decreases to zero very fast
rshaojimmy opened this issue · 2 comments
rshaojimmy commented
Thanks for your nice codes!
During my training, the difference loss (loss_diff) decreases to 0 within the first 100 steps and remains 0 afterwards.
May I ask whether you also encountered this?
Thanks.
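For reference, this is roughly how I understand the difference loss to work (a minimal sketch of a DSN-style orthogonality penalty, not the repo's exact code): it penalizes correlation between the shared and private feature batches, so it is trivially zero whenever the private features are all zeros.

```python
import torch

def diff_loss(shared_feat: torch.Tensor, private_feat: torch.Tensor) -> torch.Tensor:
    """Orthogonality penalty between shared and private features (DSN-style sketch).

    Flattens both batches, L2-normalizes each row, and averages the squared
    entries of the cross-correlation matrix S^T P. If the private encoder
    collapses to all-zero outputs, this loss is exactly zero.
    """
    b = shared_feat.size(0)
    s = shared_feat.view(b, -1)
    q = private_feat.view(b, -1)

    # Row-wise L2 normalization; the epsilon guards against division by zero,
    # which is exactly the situation when the private features are all zeros.
    s = s / (s.norm(dim=1, keepdim=True) + 1e-6)
    q = q / (q.norm(dim=1, keepdim=True) + 1e-6)

    # Mean of squared entries of the (dim x dim) correlation matrix.
    return (s.t().mm(q)).pow(2).mean()
```

So loss_diff being 0 from the very start is consistent with the private encoder producing (near-)zero features rather than with the two subspaces actually being separated.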
FLAWLESSJade commented
Hi guys, have you solved it? I'm running into the same situation... T T
loss_diff and loss_simse stay at 0 from epoch 0 to 99, which is really confusing :(
chenxi52 commented
I also encountered the same problem, and found that the private code learns nothing: the private features are all zeros. I didn't use the original experimental datasets directly, but I don't think that matters much.
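One way to confirm this kind of collapse (a minimal sketch; the tensor names are hypothetical stand-ins, not taken from the repo) is to log the norms of the two encoder outputs during training:

```python
import torch

# Stand-ins for one batch of encoder outputs; in the real training loop these
# would come from the shared and private encoders.
shared_feat = torch.randn(32, 128)
private_feat = torch.zeros(32, 128)  # what a collapsed private encoder looks like

with torch.no_grad():
    print(f"shared norm:  {shared_feat.norm().item():.4f}")   # clearly > 0
    print(f"private norm: {private_feat.norm().item():.4f}")  # ~0 when collapsed
```

If the private norm sits near zero while the shared norm does not, loss_diff can be 0 from the first steps without the private encoder having learned anything.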