facebookresearch/swav

Degraded DCv2 Training Loss

gorkaydemir opened this issue · 0 comments

Hi,
Congratulations on your great work and thank you for making it public.
I am training DCv2 on my custom dataset; however, the training (cross-entropy) loss decreases monotonically to zero, and its linear evaluation performance degrades with each epoch. What could be the reason?
I have also tried adding an entropy term to prevent possible collapse (as in SwAV), but it did not help.
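For clarity, the entropy term I tried looks roughly like the sketch below: the usual cross-entropy against the cluster pseudo-labels, plus a term that maximizes the entropy of the batch-averaged prediction so that all samples cannot collapse into one prototype. The function name, the `entropy_weight` value, and the exact tensor shapes are just illustrative, not from the repo code.

```python
import torch
import torch.nn.functional as F

def loss_with_entropy_reg(scores, assignments, temperature=0.1, entropy_weight=1.0):
    # scores: (batch, num_prototypes) logits; assignments: (batch,) pseudo-labels
    # Standard DCv2-style cross-entropy against the cluster assignments.
    ce = F.cross_entropy(scores / temperature, assignments)

    # Batch-averaged prediction distribution; maximizing its entropy pushes
    # prototype usage toward uniform and should counteract collapse.
    probs = F.softmax(scores / temperature, dim=1)
    avg_probs = probs.mean(dim=0)
    mean_entropy = -(avg_probs * torch.log(avg_probs + 1e-8)).sum()

    return ce - entropy_weight * mean_entropy
```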

Thanks in advance.