MoCo Implementation
czhaneva opened this issue · 6 comments
Thank you for your work!
How can I modify the loss to MoCo style?
Thanks!
Actually, you don't need to modify the loss, as SimCLR and MoCo use (almost!) the same InfoNCE loss. You just need to feed the query and key representations into the current loss function. Here is an example:
import dcl  # this repo's loss module

loss_fn = dcl.DCL(temperature=0.5)
loss = loss_fn(query, key)
In the above code, query and key are the representations obtained from the query and key heads of the MoCo framework, respectively.
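For concreteness, here is a minimal sketch of how query and key might be produced in a MoCo-style setup before being passed to the loss. The encoder definitions, dimensions, and momentum value below are illustrative assumptions, not code from this repo; only dcl.DCL itself comes from here.

import copy
import torch
import torch.nn as nn
import dcl

# Hypothetical query/key heads; any encoder + projection head would do.
encoder_q = nn.Sequential(nn.Linear(512, 128))
encoder_k = copy.deepcopy(encoder_q)
for p in encoder_k.parameters():
    p.requires_grad = False  # the key head is updated by momentum, not by gradients

loss_fn = dcl.DCL(temperature=0.5)

x_q = torch.randn(32, 512)  # query-view features (placeholder input)
x_k = torch.randn(32, 512)  # key-view features (placeholder input)

query = encoder_q(x_q)
with torch.no_grad():
    key = encoder_k(x_k)

loss = loss_fn(query, key)
loss.backward()

# Momentum (EMA) update of the key head, as in MoCo.
m = 0.999
with torch.no_grad():
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.mul_(m).add_(p_q, alpha=1 - m)

The key point is that the loss itself is unchanged; only where query and key come from differs from the SimCLR setup.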
Please let me know if this doesn't make sense.
PS: If you are going to test this on MoCo, it would be greatly appreciated if you could add it to this repo too!
Thanks for your reply.
Both SimCLR and MoCo use the InfoNCE loss, but they differ in the definition of positive and negative samples.
SimCLR samples positive and negative pairs within a mini-batch.
In MoCo, the positive-pair sampling strategy is similar to SimCLR's, but the negative pairs are sampled from the queue of keys.
So, I'd like to know how to modify it to use MoCo's queue style (as discussed in Fig. 3(b) of the paper).
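For reference, a tiny sketch of the queue mechanism described above; the queue size, dimensions, and names are hypothetical:

import torch

K, D = 4096, 128
queue = torch.randn(K, D)  # past key representations serve as negatives

key = torch.randn(32, D)   # keys from the current batch (positives for their queries)

# FIFO update: enqueue the newest keys, drop the oldest ones.
queue = torch.cat([key.detach(), queue], dim=0)[:K]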
Yes, that's right. That's why I said they have almost the same loss. I think the above approach should still work in this scenario (it might even work better, as it gives the model more negative samples). However, if you would like to use exactly the same negative-sample selection as MoCo, you can replace the following lines in the dcl.py file:
neg_similarity = torch.cat((torch.mm(z1, z1.t()), cross_view_distance), dim=1) / self.temperature
neg_mask = torch.eye(z1.size(0), device=z1.device).repeat(1, 2)
with the below code:
neg_similarity = cross_view_distance / self.temperature
neg_mask = torch.eye(z1.size(0), device=z1.device)
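For context, here is a simplified sketch of how those two lines sit inside the loss computation. The variable names follow dcl.py, but the surrounding code is a stripped-down assumption rather than the repo's exact implementation:

import torch
import torch.nn.functional as F

def dcl_cross_view_negatives(z1, z2, temperature=0.5):
    # z1: query representations, z2: key representations, both of shape (N, D)
    z1 = F.normalize(z1, dim=1)  # assuming inputs are not yet L2-normalized
    z2 = F.normalize(z2, dim=1)
    cross_view_distance = torch.mm(z1, z2.t())  # (N, N) query-key similarities
    positive_loss = -torch.diag(cross_view_distance) / temperature
    # MoCo-style negatives: only cross-view (query-key) pairs, i.e. the replacement above
    neg_similarity = cross_view_distance / temperature
    neg_mask = torch.eye(z1.size(0), device=z1.device)
    # mask out the positive pair on the diagonal before the log-sum-exp
    negative_loss = torch.logsumexp(neg_similarity + neg_mask * -1e9, dim=1)
    return (positive_loss + negative_loss).mean()

With this change, each query is contrasted only against the keys of the other samples, which matches MoCo's cross-view negative selection (in full MoCo, those keys would come from the queue).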
Thank you very much. I'd like to try it with MoCo, and I will add the results to this repo when I finish.
Many thanks!
Have you implemented DCL on MoCo? Can you share your code with me? Thank you so much!