Eromera/erfnet_pytorch

What's the difference between the two weights?

MrLinNing opened this issue · 5 comments

hi, @Eromera
What's the difference between the two weights? If I train from scratch, what weight should I use?

Another question: why do you test the speed of erfnet_nobn.py rather than erfnet.py?
I tested both on a TITAN Xp: erfnet_nobn takes 0.056 s and erfnet.py takes 0.063 s (image size 1024x2048).

@MrLinNing The first is for encoder training (the `if (enc):` branch) and the second is for decoder training. The right one is chosen automatically depending on whether you use the --decoder flag while training.

In the forward-pass time evaluation we use erfnet_nobn because batch norm layers can be absorbed into the preceding conv layers by folding the BN statistics into the conv weights and biases, so these layers can be removed for the timing calculation.

@Eromera How do you absorb the batchnorm layers into the conv layers? Can you give me a code example?

@MrLinNing I only know of this code from Torch. I haven't checked an implementation in PyTorch, but it's a totally feasible thing to do in theory.
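For what it's worth, the folding itself is just arithmetic on the stored statistics, so it can be sketched framework-free. This is a minimal numpy sketch (not code from this repo; the function name and shapes are my own assumptions): a conv with weight W and bias b followed by BN with parameters gamma, beta and running statistics mean, var is equivalent to a single conv with W' = W * gamma/sqrt(var + eps) and b' = (b - mean) * gamma/sqrt(var + eps) + beta.

```python
import numpy as np

def fold_bn_into_conv(conv_w, conv_b, gamma, beta, mean, var, eps=1e-5):
    """Fold BatchNorm statistics into the preceding conv layer.

    conv_w: (out_ch, in_ch, kh, kw) conv weight
    conv_b: (out_ch,) conv bias
    gamma, beta, mean, var: (out_ch,) BN affine params and running stats
    Returns (folded_w, folded_b) so that conv(x, folded_w, folded_b)
    == BN(conv(x, conv_w, conv_b)) at inference time.
    """
    scale = gamma / np.sqrt(var + eps)
    folded_w = conv_w * scale[:, None, None, None]  # scale each output channel
    folded_b = (conv_b - mean) * scale + beta
    return folded_w, folded_b

# Quick numerical check on a 1x1 conv (so the conv is a plain matmul per pixel).
rng = np.random.default_rng(0)
out_ch, in_ch = 4, 3
w = rng.normal(size=(out_ch, in_ch, 1, 1))
b = rng.normal(size=out_ch)
gamma = rng.normal(size=out_ch)
beta = rng.normal(size=out_ch)
mean = rng.normal(size=out_ch)
var = rng.uniform(0.5, 2.0, size=out_ch)

x = rng.normal(size=in_ch)                     # a single spatial position
y = w[:, :, 0, 0] @ x + b                      # conv output
bn_out = (y - mean) / np.sqrt(var + 1e-5) * gamma + beta
fw, fb = fold_bn_into_conv(w, b, gamma, beta, mean, var)
fold_out = fw[:, :, 0, 0] @ x + fb             # folded conv output
assert np.allclose(bn_out, fold_out)
```

In a PyTorch model you would read gamma/beta from the BatchNorm2d's .weight/.bias and the statistics from .running_mean/.running_var, write the folded tensors back into the conv, and drop the BN module. That is why a no-BN variant of the network is the fair one to time.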

Hi, I have another question: I know that the two sets of weights are for the encoder and the decoder, but what is the formula used to compute them? If they are computed from the dataset, shouldn't they be the same? Thank you