lightly-ai/lightly

VICReg Loss De-Means Twice?

RylanSchaeffer opened this issue · 1 comment

x = x - x.mean(dim=0)                 # explicit de-mean
std = torch.sqrt(x.var(dim=0) + eps)  # .var() also subtracts the mean internally

I think the VICReg loss removes the mean and then calls .var(), which also de-means internally (see: https://pytorch.org/docs/stable/generated/torch.var.html).

If I understand correctly, that seems unnecessary?
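As a quick sanity check along these lines (my own snippet, not code from the repo, assuming x is a batch of embeddings):

import torch

x = torch.randn(32, 128)  # hypothetical (batch, dim) embeddings

# .var() already computes the variance around the per-dimension mean,
# so subtracting the mean beforehand leaves the result unchanged.
assert torch.allclose(x.var(dim=0), (x - x.mean(dim=0)).var(dim=0), atol=1e-6)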

Hi @RylanSchaeffer, thank you for bringing this up. I think you are right; the explicit de-meaning is only necessary for the covariance term.
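For illustration, a rough sketch (not the actual lightly implementation) of how the variance and covariance terms could be written so the explicit de-mean is only applied where the covariance term needs it:

import torch

def variance_covariance_terms(x: torch.Tensor, eps: float = 1e-4):
    # Variance term: .var() de-means internally, no explicit subtraction needed.
    std = torch.sqrt(x.var(dim=0) + eps)
    var_loss = torch.mean(torch.relu(1.0 - std))

    # Covariance term: here centering is required, since the covariance
    # matrix is computed from mean-subtracted features.
    n, d = x.shape
    x_centered = x - x.mean(dim=0)
    cov = (x_centered.T @ x_centered) / (n - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    cov_loss = off_diag.pow(2).sum() / d

    return var_loss, cov_loss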