princeton-vl/RAFT-Stereo

Query about freezing BN from the beginning

Torment123 opened this issue · 3 comments

Hi,
I noticed that during training, all BatchNorm2d layers are frozen from the very beginning, which means their parameters keep their initial values and are never learned. I'm a bit confused about why this step is needed. Thanks.

I believe BatchNorm2d's running statistics are initialized to mean 0 and variance 1 by default (and its affine weight and bias to 1 and 0), so a frozen layer acts as a near-identity transform. We freeze the BatchNorm2d parameters because empirically this worked well, although the difference was negligible.

Thanks for your fast response.