BatchNormalization throws error when change_ordering=True
oconnor127 opened this issue · 0 comments
oconnor127 commented
Hey,
I cannot convert a model that uses BatchNormalization layers because of dimension mismatches. Assume the input tensor for BN is 48x112x112 (CHW): the parameters (e.g. gamma) end up with a size of 112, which is obviously wrong and should be 48 (the HWC shape would be 112x112x48).
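For context, here is a minimal sketch (mine, not from the converter) of the Keras axis semantics involved: with channels-first input, BN must normalize axis 1 and gets 48 parameters; after reordering to channels-last, the axis has to become 3 for gamma to keep its 48 entries.

```python
import tensorflow as tf

# Channels-first (CHW) input: the channel axis is 1, so gamma has 48 entries.
chw = tf.keras.Input(shape=(48, 112, 112))
bn_chw = tf.keras.layers.BatchNormalization(axis=1)
_ = bn_chw(chw)  # build the layer so its weights exist
print(bn_chw.gamma.shape)  # (48,)

# Channels-last (HWC) input: the channel axis moves to 3, gamma still (48,).
hwc = tf.keras.Input(shape=(112, 112, 48))
bn_hwc = tf.keras.layers.BatchNormalization(axis=3)
_ = bn_hwc(hwc)
print(bn_hwc.gamma.shape)  # (48,)
```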
However, until this problem is fixed properly, I circumvent it by modifying onnx_to_keras() in converter.py. At line 229 (inside the "if layer['config'] and 'axis' in layer['config']" check) I added:
if "epsilon" in layer['config']:
layer['config']['axis'][0] = 3
This swaps the axis when the layer is a BN layer (indicated by an "epsilon", "gamma_initializer", etc. in the config).
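For anyone applying this, a hedged usage sketch to check the workaround after patching converter.py (the file path and input name are placeholders): the BN gammas should match the channel count, not the spatial size.

```python
import onnx
from onnx2keras import onnx_to_keras

onnx_model = onnx.load('model.onnx')  # placeholder path
k_model = onnx_to_keras(onnx_model, ['input'],  # 'input' is a placeholder name
                        change_ordering=True)

# Every BatchNormalization layer should now carry per-channel parameters,
# e.g. gamma of shape (48,) for a 48-channel tensor, not (112,).
for layer in k_model.layers:
    if 'BatchNormalization' in type(layer).__name__:
        print(layer.name, layer.gamma.shape)
```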
Just to let people who run into the same problem in the future know.