ZFTurbo/Keras-inference-time-optimizer

Model is not correct when it has multiple inputs and outputs

mrlzla opened this issue · 3 comments

The last line of the reduce_keras_model method creates a model with only one input and one output. That is not correct in general, since a Keras model can have multiple inputs and multiple outputs.
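For reference, here is a minimal sketch of a model that would hit this case: two inputs, two outputs, with the Conv2D + BatchNormalization pairs that the optimizer is meant to fuse. Layer names and shapes are arbitrary, chosen only for illustration.

```python
from keras.layers import Input, Conv2D, BatchNormalization, Activation, Concatenate
from keras.models import Model

def conv_bn(x, filters):
    # Conv + BatchNorm pair, the pattern the optimizer fuses
    x = Conv2D(filters, (3, 3), padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    return Activation('relu')(x)

# Two independent inputs
inp_a = Input(shape=(32, 32, 3), name='input_a')
inp_b = Input(shape=(32, 32, 3), name='input_b')

merged = Concatenate()([conv_bn(inp_a, 16), conv_bn(inp_b, 16)])

# Two independent outputs
out_1 = Conv2D(1, (1, 1), activation='sigmoid', name='out_1')(merged)
out_2 = Conv2D(4, (1, 1), activation='softmax', name='out_2')(merged)

model = Model(inputs=[inp_a, inp_b], outputs=[out_1, out_2])
```

A reduced model built from this one must keep both inputs and both outputs, not just the first of each.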

Thank you. I guess we need to create a generic test for such a case.

I am also aware of a problem with complex models that include other models as layers. The code won't work on them either. For example: RetinaNet.
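The nested-model pattern mentioned here looks roughly like the sketch below: a sub-model is called as a layer inside a larger model, so the layer graph contains a Model node rather than plain layers. This is only an illustrative example, not taken from RetinaNet itself.

```python
from keras.layers import Input, Conv2D, BatchNormalization
from keras.models import Model

# A small sub-model (stand-in for a backbone)
sub_in = Input(shape=(None, None, 3))
x = Conv2D(8, (3, 3), padding='same', use_bias=False)(sub_in)
x = BatchNormalization()(x)
backbone = Model(sub_in, x, name='backbone')

# The sub-model is used as a layer inside a larger model
outer_in = Input(shape=(64, 64, 3))
features = backbone(outer_in)      # nested model call
head = Conv2D(1, (1, 1))(features)
nested_model = Model(outer_in, head)
```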

The problem has been fixed. A test with multiple inputs and multiple outputs was added.
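Such a test could look roughly like the sketch below, which compares predictions of the original and reduced models on the two-input / two-output example from the first snippet. The import assumes the package is installed under the name `kito` (as published on PyPI); the tolerance value is arbitrary.

```python
import numpy as np
from kito import reduce_keras_model

# `model` is the multi-input / multi-output model from the first snippet
model_reduced = reduce_keras_model(model)

x = [np.random.rand(2, 32, 32, 3).astype('float32'),
     np.random.rand(2, 32, 32, 3).astype('float32')]

preds_orig = model.predict(x)
preds_red = model_reduced.predict(x)

# Both outputs must survive the reduction and match up to numerical tolerance
assert len(preds_orig) == len(preds_red) == 2
for a, b in zip(preds_orig, preds_red):
    assert np.allclose(a, b, atol=1e-5)
```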

Good job!