Eromera/erfnet_pytorch

Why do you have 2 consecutive batch norm layers?


I'm just curious why you have 2 consecutive batch norm layers here. Also, is the encoder part for ImageNet and Cityscapes exactly the same? At least in the code they seem to differ in this batch norm detail.
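For context, here is a minimal sketch (hypothetical, not the repo's exact code) of the pattern I mean: two `BatchNorm2d` layers applied back-to-back. In eval mode each batch norm reduces to a fixed per-channel affine map, so the composition of two is just another affine map, meaning the second layer adds parameters but no extra expressive power at inference:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the pattern in question: two consecutive
# BatchNorm2d layers (names and sizes are illustrative).
double_bn = nn.Sequential(
    nn.BatchNorm2d(16),
    nn.BatchNorm2d(16),  # second, seemingly redundant batch norm
)

# In eval mode each BatchNorm is a fixed per-channel affine transform,
# so two in a row compose into a single affine transform.
double_bn.eval()
x = torch.randn(2, 16, 8, 8)
y = double_bn(x)

# With default running stats (mean 0, var 1) and affine init (gamma 1,
# beta 0), the pair is numerically close to the identity map.
print(torch.allclose(y, x, atol=1e-3))  # True
```

During training the two layers do normalize with slightly different batch statistics, but it is not obvious that this is intentional rather than a leftover.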