lightly-ai/lightly

deactivate_requires_grad does not fix the model!

Closed this issue · 2 comments

Hi, I just noticed that in the examples the backbone is frozen by this line: https://github.com/lightly-ai/lightly/blob/master/lightly/models/utils.py#L168

It sets requires_grad to False for the model's parameters. However, the batchnorm statistics (running_mean, running_var, etc.) are still updated during forward passes. You should also put the batchnorm modules into eval mode if fully freezing the backbone is desired.
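A minimal PyTorch sketch of the point above: setting requires_grad to False freezes the parameters, but batchnorm running statistics are still updated by forward passes while the module is in train mode. The `freeze_batchnorm` helper here is only an illustration of the suggested fix, not part of lightly.

```python
import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())

# Same effect as deactivate_requires_grad: no gradients for the parameters.
for param in backbone.parameters():
    param.requires_grad = False

bn = backbone[1]
before = bn.running_mean.clone()
backbone(torch.randn(4, 3, 32, 32))             # forward pass, module still in train mode
print(torch.allclose(before, bn.running_mean))  # False: running stats were updated

# To also fix the statistics, put the batchnorm modules into eval mode.
def freeze_batchnorm(model: nn.Module) -> None:
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            module.eval()

freeze_batchnorm(backbone)
before = bn.running_mean.clone()
backbone(torch.randn(4, 3, 32, 32))
print(torch.allclose(before, bn.running_mean))  # True: running stats are now fixed
```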

Hi, thanks for bringing this up! This is correct, and whether the batch norm statistics should be updated depends on the method. As far as I know, most methods do not set the batchnorm modules to eval; for example, MoCo, DINO, and MSN all leave the batchnorm layers in train mode.
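For context, a rough sketch of the pattern the reply refers to, assuming deactivate_requires_grad is importable from lightly.models.utils as in the linked file: the momentum encoder receives no gradients but stays in train mode, so its batchnorm statistics keep tracking the data while its weights only change through the momentum update. The manual EMA loop below is illustrative, not lightly's own helper.

```python
import copy

import torch
import torch.nn as nn
from lightly.models.utils import deactivate_requires_grad

backbone = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
backbone_momentum = copy.deepcopy(backbone)
deactivate_requires_grad(backbone_momentum)  # no gradients, but still in train mode

@torch.no_grad()
def ema_update(model: nn.Module, model_ema: nn.Module, m: float = 0.99) -> None:
    # Illustrative exponential moving average of the online weights.
    for p, p_ema in zip(model.parameters(), model_ema.parameters()):
        p_ema.mul_(m).add_(p, alpha=1.0 - m)

# Forward passes still update the momentum encoder's batchnorm running stats,
# which is the behavior described for MoCo/DINO/MSN-style training.
_ = backbone_momentum(torch.randn(4, 3, 32, 32))
ema_update(backbone, backbone_momentum)
```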

Closing this for now, please reopen if you have more questions :)