Inconsistency when batch size varies
ShengyuH opened this issue · 2 comments
ShengyuH commented
Hi Chris,
I'm performing a classification task using MinkowskiEngine. I train the neural network with batch_size=8.
At test time I set batch_size=1, and the results are quite bad; there is a large gap between the validation set and the test set. When I increase the batch size, the results improve.
Can you help explain this? I suspect the batch normalization layers might be the cause, but I've never encountered this phenomenon with other frameworks.
chrischoy commented
Maybe model.eval()?
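For context, a minimal sketch of what this fix looks like in a typical PyTorch evaluation loop (the model, test_loader, and evaluate names are hypothetical placeholders, not from this thread):

```python
import torch

def evaluate(model, test_loader):
    # model.eval() switches BatchNorm (and Dropout) layers to inference
    # behavior: BatchNorm normalizes with the running statistics accumulated
    # during training rather than per-batch statistics, so the output no
    # longer depends on the test batch size.
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():  # gradients are not needed at test time
        for inputs, labels in test_loader:
            logits = model(inputs)
            correct += (logits.argmax(dim=1) == labels).sum().item()
            total += labels.numel()
    return correct / total
```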
ShengyuH commented
Yes!! I missed model.eval(). Thanks!
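For future readers, a short self-contained demonstration of why training-mode BatchNorm makes outputs depend on batch size (plain torch.nn.BatchNorm1d is used here for illustration; it is the same normalization mechanism that batch-norm layers in MinkowskiEngine rely on):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm1d(4)
x = torch.randn(8, 4)

bn.train()                      # training mode: normalize with per-batch statistics
out_full = bn(x)[:2]            # first two samples, normalized within a batch of 8
out_small = bn(x[:2])           # same two samples, normalized within a batch of 2
print(torch.allclose(out_full, out_small))  # False: output depends on batch size

bn.eval()                       # eval mode: normalize with fixed running mean/var
out_full = bn(x)[:2]
out_small = bn(x[:2])
print(torch.allclose(out_full, out_small))  # True: batch size no longer matters
```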