ZFTurbo/Keras-inference-time-optimizer

Bug in layer names after conversion?

mrgloom opened this issue · 1 comment

Here is the model summary of the initial model, before conversion:

Loaded model from models/model_raw.h5
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 64, 128, 3)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 64, 128, 16)       448       
_________________________________________________________________
batch_normalization_1 (Batch (None, 64, 128, 16)       64        
_________________________________________________________________
activation_1 (Activation)    (None, 64, 128, 16)       0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 32, 64, 16)        0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 32, 64, 32)        4640      
_________________________________________________________________
batch_normalization_2 (Batch (None, 32, 64, 32)        128       
_________________________________________________________________
activation_2 (Activation)    (None, 32, 64, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 16, 32, 32)        0         
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 16, 32, 16)        528       
_________________________________________________________________
batch_normalization_3 (Batch (None, 16, 32, 16)        64        
_________________________________________________________________
activation_3 (Activation)    (None, 16, 32, 16)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 16, 32, 128)       18560     
_________________________________________________________________
batch_normalization_4 (Batch (None, 16, 32, 128)       512       
_________________________________________________________________
activation_4 (Activation)    (None, 16, 32, 128)       0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 16, 32, 16)        2064      
_________________________________________________________________
batch_normalization_5 (Batch (None, 16, 32, 16)        64        
_________________________________________________________________
activation_5 (Activation)    (None, 16, 32, 16)        0         
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 16, 32, 128)       18560     
_________________________________________________________________
batch_normalization_6 (Batch (None, 16, 32, 128)       512       
_________________________________________________________________
activation_6 (Activation)    (None, 16, 32, 128)       0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 8, 16, 128)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 8, 16, 32)         4128      
_________________________________________________________________
batch_normalization_7 (Batch (None, 8, 16, 32)         128       
_________________________________________________________________
activation_7 (Activation)    (None, 8, 16, 32)         0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 16, 256)        73984     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 16, 256)        1024      
_________________________________________________________________
activation_8 (Activation)    (None, 8, 16, 256)        0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 8, 16, 32)         8224      
_________________________________________________________________
batch_normalization_9 (Batch (None, 8, 16, 32)         128       
_________________________________________________________________
activation_9 (Activation)    (None, 8, 16, 32)         0         
_________________________________________________________________
conv2d_10 (Conv2D)           (None, 8, 16, 256)        73984     
_________________________________________________________________
batch_normalization_10 (Batc (None, 8, 16, 256)        1024      
_________________________________________________________________
activation_10 (Activation)   (None, 8, 16, 256)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 4, 8, 256)         0         
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 4, 8, 64)          16448     
_________________________________________________________________
batch_normalization_11 (Batc (None, 4, 8, 64)          256       
_________________________________________________________________
activation_11 (Activation)   (None, 4, 8, 64)          0         
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 4, 8, 512)         295424    
_________________________________________________________________
batch_normalization_12 (Batc (None, 4, 8, 512)         2048      
_________________________________________________________________
activation_12 (Activation)   (None, 4, 8, 512)         0         
_________________________________________________________________
conv2d_13 (Conv2D)           (None, 4, 8, 64)          32832     
_________________________________________________________________
batch_normalization_13 (Batc (None, 4, 8, 64)          256       
_________________________________________________________________
activation_13 (Activation)   (None, 4, 8, 64)          0         
_________________________________________________________________
conv2d_14 (Conv2D)           (None, 4, 8, 512)         295424    
_________________________________________________________________
batch_normalization_14 (Batc (None, 4, 8, 512)         2048      
_________________________________________________________________
activation_14 (Activation)   (None, 4, 8, 512)         0         
_________________________________________________________________
conv2d_15 (Conv2D)           (None, 4, 8, 40)          20520     
_________________________________________________________________
global_average_pooling2d_1 ( (None, 40)                0         
=================================================================
Total params: 874,024
Trainable params: 869,896
Non-trainable params: 4,128
_________________________________________________________________
len(model.layers) 49

And here it is after conversion using KITO:

Loaded model from models/model_raw_kito.h5
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 64, 128, 3)        0         
_________________________________________________________________
batch_normalization_1 (Conv2 (None, 64, 128, 16)       448       
_________________________________________________________________
activation_1 (Activation)    (None, 64, 128, 16)       0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 32, 64, 16)        0         
_________________________________________________________________
batch_normalization_2 (Conv2 (None, 32, 64, 32)        4640      
_________________________________________________________________
activation_2 (Activation)    (None, 32, 64, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 16, 32, 32)        0         
_________________________________________________________________
batch_normalization_3 (Conv2 (None, 16, 32, 16)        528       
_________________________________________________________________
activation_3 (Activation)    (None, 16, 32, 16)        0         
_________________________________________________________________
batch_normalization_4 (Conv2 (None, 16, 32, 128)       18560     
_________________________________________________________________
activation_4 (Activation)    (None, 16, 32, 128)       0         
_________________________________________________________________
batch_normalization_5 (Conv2 (None, 16, 32, 16)        2064      
_________________________________________________________________
activation_5 (Activation)    (None, 16, 32, 16)        0         
_________________________________________________________________
batch_normalization_6 (Conv2 (None, 16, 32, 128)       18560     
_________________________________________________________________
activation_6 (Activation)    (None, 16, 32, 128)       0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 8, 16, 128)        0         
_________________________________________________________________
batch_normalization_7 (Conv2 (None, 8, 16, 32)         4128      
_________________________________________________________________
activation_7 (Activation)    (None, 8, 16, 32)         0         
_________________________________________________________________
batch_normalization_8 (Conv2 (None, 8, 16, 256)        73984     
_________________________________________________________________
activation_8 (Activation)    (None, 8, 16, 256)        0         
_________________________________________________________________
batch_normalization_9 (Conv2 (None, 8, 16, 32)         8224      
_________________________________________________________________
activation_9 (Activation)    (None, 8, 16, 32)         0         
_________________________________________________________________
batch_normalization_10 (Conv (None, 8, 16, 256)        73984     
_________________________________________________________________
activation_10 (Activation)   (None, 8, 16, 256)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 4, 8, 256)         0         
_________________________________________________________________
batch_normalization_11 (Conv (None, 4, 8, 64)          16448     
_________________________________________________________________
activation_11 (Activation)   (None, 4, 8, 64)          0         
_________________________________________________________________
batch_normalization_12 (Conv (None, 4, 8, 512)         295424    
_________________________________________________________________
activation_12 (Activation)   (None, 4, 8, 512)         0         
_________________________________________________________________
batch_normalization_13 (Conv (None, 4, 8, 64)          32832     
_________________________________________________________________
activation_13 (Activation)   (None, 4, 8, 64)          0         
_________________________________________________________________
batch_normalization_14 (Conv (None, 4, 8, 512)         295424    
_________________________________________________________________
activation_14 (Activation)   (None, 4, 8, 512)         0         
_________________________________________________________________
conv2d_15 (Conv2D)           (None, 4, 8, 40)          20520     
_________________________________________________________________
global_average_pooling2d_1 ( (None, 40)                0         
=================================================================
Total params: 865,768
Trainable params: 865,768
Non-trainable params: 0
_________________________________________________________________
len(model.layers) 35

Looks like the fused conv layers are wrongly named after the batch_normalization layers.
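For reference, the conversion was presumably done with KITO's reduce_keras_model entry point; a minimal sketch, assuming the model paths shown in the summaries above:

from keras.models import load_model
from kito import reduce_keras_model

# Load the original model, fuse each Conv2D + BatchNormalization pair,
# and save the reduced model under a new name
model = load_model('models/model_raw.h5')
model_reduced = reduce_keras_model(model)
model_reduced.save('models/model_raw_kito.h5')
model_reduced.summary()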

It's not really a bug: KITO just uses the name of the second layer. Looks like I did it on purpose. I found the related comment in the code )

# We use batch norm name here to find it later
layer_copy.name = bn.name
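
For context on why the parameter count drops from 874,024 to 865,768: the missing 8,256 parameters are exactly the BatchNormalization parameters, which get folded into the weights and bias of the preceding Conv2D. A minimal sketch of that arithmetic (not KITO's actual code; fold_bn_into_conv is a hypothetical helper, with eps defaulting to Keras' BN epsilon):

import numpy as np

def fold_bn_into_conv(W, b, gamma, beta, mean, var, eps=1e-3):
    # BN(conv(x)) = gamma * (conv(x) - mean) / sqrt(var + eps) + beta,
    # which is itself a convolution with rescaled weights and a shifted bias.
    scale = gamma / np.sqrt(var + eps)  # per-output-channel scale, shape (out_ch,)
    W_fused = W * scale                 # W shape (kh, kw, in_ch, out_ch); broadcasts over out_ch
    b_fused = (b - mean) * scale + beta
    return W_fused, b_fused

The fused layer then inherits the BN layer's name via the line quoted above, which is why the second summary shows entries like batch_normalization_1 (Conv2D).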