Deserialization error with `Lambda` layers when loading saved model in new environment
bourcierj opened this issue · 0 comments
When one saves a converted Keras model with `Lambda` layers to disk and loads it in a completely separate environment, one gets errors like `TypeError: Exception encountered when calling layer "LAYER_173_CHW" (type Lambda)`. This is because `Lambda` layers are fundamentally non-portable across environments (cf. the Keras docs).
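For context, the mechanism behind this is easy to reproduce outside of this library. The following minimal sketch is purely illustrative (the helper and layer names are made up); it shows how a `Lambda` layer ties the saved model to the Python module that defined its function:

```python
import tensorflow as tf

# Hypothetical helper standing in for a function defined in a converter module
# such as onnx2keras.reshape_layers; the Lambda below stores only a reference
# to it (module name + function name) in the model config.
def hwc_to_chw(x):
    return tf.transpose(x, [0, 3, 1, 2])

inputs = tf.keras.Input(shape=(224, 224, 3))
outputs = tf.keras.layers.Lambda(hwc_to_chw, name="example_chw")(inputs)
model = tf.keras.Model(inputs, outputs)

# Saving works in the environment that defines hwc_to_chw...
model.save("lambda_model", save_format="tf")

# ...but loading in an environment where the defining module is not importable
# first emits the "... is not loaded, but a Lambda layer uses it" UserWarning
# and can then fail with a TypeError when the layer is called:
#
#   tf.keras.models.load_model("lambda_model")
```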
The root cause is indicated by warnings emitted right before the exception is raised:
[...]UserWarning: onnx2keras.pooling_layers is not loaded, but a Lambda layer uses it. It may cause errors.
[...]UserWarning: onnx2keras.reshape_layers is not loaded, but a Lambda layer uses it. It may cause errors.

To solve this, we need to replace all `Lambda` layers with native or custom layers (or at least, in this example where the model is ResNet18, the ones related to pooling and reshape).
However, `Lambda` layers are used everywhere in this library, so this could require a great amount of work to cover all cases.
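As an illustration of what such a replacement could look like, here is a sketch (hypothetical class and layer names, not an actual patch to this library) of a dimension-permuting `Lambda` rewritten as a `tf.keras.layers.Layer` subclass with `get_config`, which can be reloaded as long as the class is supplied via `custom_objects` (or registered with `tf.keras.utils.register_keras_serializable`):

```python
import tensorflow as tf

# Hypothetical replacement for a Lambda that permutes dimensions; unlike a
# Lambda, it carries its own logic and config, so load_model can rebuild it
# from the saved config when the class is passed via `custom_objects`.
class PermuteDims(tf.keras.layers.Layer):
    def __init__(self, perm, **kwargs):
        super().__init__(**kwargs)
        self.perm = list(perm)

    def call(self, inputs):
        return tf.transpose(inputs, perm=self.perm)

    def get_config(self):
        config = super().get_config()
        config.update({"perm": self.perm})
        return config


inputs = tf.keras.Input(shape=(224, 224, 3))
outputs = PermuteDims([0, 3, 1, 2], name="example_chw")(inputs)
model = tf.keras.Model(inputs, outputs)
model.save("custom_layer_model", save_format="tf")

# In any environment that has this class available:
reloaded = tf.keras.models.load_model(
    "custom_layer_model", custom_objects={"PermuteDims": PermuteDims}
)
```

The same pattern would apply to the pooling- and reshape-related `Lambda`s mentioned in the warnings above.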
Steps to reproduce (a rough script is sketched below):
- convert ResNet18 from PyTorch to Keras (similarly to what's done in `test.models.test_resnet18.test_resnet18`)
- save the Keras model in the SavedModel format with `tf.keras.Model.save(..., save_format="tf")`
- load the saved model with `tf.keras.models.load_model` in a completely separate environment
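For reference, the conversion-and-save part of the reproduction might look roughly like this (a sketch assuming the usual `torch.onnx.export` → `onnx.load` → `onnx_to_keras` path; file names, input names, and converter options are placeholders and may differ from the project's test):

```python
# Environment A: convert ResNet18 to Keras and save it as a SavedModel.
import onnx
import torch
import torchvision.models as models
from onnx2keras import onnx_to_keras

pt_model = models.resnet18(pretrained=True).eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX, then convert the ONNX graph to a Keras model.
torch.onnx.export(pt_model, dummy_input, "resnet18.onnx", input_names=["input"])
onnx_model = onnx.load("resnet18.onnx")
k_model = onnx_to_keras(onnx_model, ["input"])

k_model.save("resnet18_keras", save_format="tf")

# Environment B (fresh environment without onnx2keras installed):
#   import tensorflow as tf
#   model = tf.keras.models.load_model("resnet18_keras")
#   -> UserWarnings about onnx2keras.* not being loaded, then the TypeError above.
```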