leftthomas/SimCLR

loss function problem

hendredlorentz opened this issue · 2 comments

Is there no need for L2 normalization in the loss function? The similarity computed there does not look the same as the torch.nn.functional.cosine_similarity function described in the official PyTorch docs.

@hendredlorentz

return F.normalize(feature, dim=-1), F.normalize(out, dim=-1)
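Since the model already returns L2-normalized embeddings, a plain dot product between them equals cosine similarity, so no extra normalization is needed inside the loss. A minimal sketch of that equivalence, assuming the loss builds its similarity matrix with a matrix product of these normalized outputs (tensor names are illustrative):

```python
import torch
import torch.nn.functional as F

# Illustrative batch of un-normalized projection-head outputs
out = torch.randn(8, 128)

# What the model's forward returns: L2-normalized embeddings
out_norm = F.normalize(out, dim=-1)

# Dot-product similarity matrix between the normalized embeddings
sim_dot = torch.mm(out_norm, out_norm.t())

# Pairwise cosine similarity on the raw outputs, same shape (8, 8)
a = out.unsqueeze(1).expand(-1, out.size(0), -1)
b = out.unsqueeze(0).expand(out.size(0), -1, -1)
sim_cos = F.cosine_similarity(a, b, dim=-1)

print(torch.allclose(sim_dot, sim_cos, atol=1e-6))  # True
```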

Thanks! Have a nice day!