ViLab-UCSD/MemSAC_ECCV2022

loss function

Closed · 1 comment

Hi, thanks for your interesting work. I have a question about the loss function in contrastive.py (lines 65-66).
The denominator of the contrastiveMatrix formula on line 66 will always be one due to the softmax operation on line 65. Is that consistent with the Supervised Contrastive loss implementation in https://github.com/HobbitLong/SupContrast? Why is softmax applied?

```python
65 - expScores = torch.softmax(confident_sim_matrix/self.tau, dim=0)
66 - contrastiveMatrix = (expScores * mask_sim).sum(0) / (expScores.sum(0))
```
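To illustrate, here is a quick check (the shapes and tensors below are toy stand-ins, not the actual variables from contrastive.py) showing that the dim=0 softmax already normalizes each column to one, so the division on line 66 is a no-op:

```python
import torch

tau = 0.07
confident_sim_matrix = torch.randn(8, 8)      # illustrative pairwise similarity scores
mask_sim = (torch.rand(8, 8) > 0.5).float()   # illustrative positive-pair mask

expScores = torch.softmax(confident_sim_matrix / tau, dim=0)

# softmax over dim=0 makes every column sum to 1,
# so the denominator is a vector of ones...
denominator = expScores.sum(0)
print(torch.allclose(denominator, torch.ones_like(denominator)))  # True

# ...and dividing by it leaves contrastiveMatrix unchanged.
contrastiveMatrix = (expScores * mask_sim).sum(0) / denominator
print(torch.allclose(contrastiveMatrix, (expScores * mask_sim).sum(0)))  # True
```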

We observed more gradient stability when dividing by the sum, although, as you rightly noted, the sum is always 1, so the ratio does not change. Does that answer your question?
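For context on the SupContrast comparison: up to details such as excluding the anchor from the denominator, the exp-then-normalize form used there is algebraically the same quantity as a dim=0 softmax, since softmax(s) = exp(s) / sum(exp(s)). One practical difference is that torch.softmax applies the usual max-subtraction trick before exponentiating, which avoids overflow at small temperatures and may relate to the stability mentioned above. A quick sanity check (shapes and temperature are illustrative):

```python
import torch

scores = torch.randn(8, 8) / 0.5  # temperature-scaled similarities (toy values)

# SupCon-style: exponentiate, then divide each column by its sum...
exp_scores = torch.exp(scores)
supcon_style = exp_scores / exp_scores.sum(0, keepdim=True)

# ...which is exactly a dim=0 softmax.
print(torch.allclose(supcon_style, torch.softmax(scores, dim=0)))  # True
```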