onnx/onnx-coreml

Problem with Generated CoreML from Bidirectional LSTM on Pytorch

leonardoaraujosantos opened this issue · 2 comments

🐞Describe the bug

After the CoreML model is created, Xcode shows an error message: (Problem Decoding document)

Trace

No error is reported during generation.
(screenshot: xcode_issue)

This is the output I get from the export process:

1/29: Converting Node Type Transpose
2/29: Converting Node Type Conv
3/29: Converting Node Type Relu
4/29: Converting Node Type MaxPool
5/29: Converting Node Type Conv
6/29: Converting Node Type Relu
7/29: Converting Node Type MaxPool
8/29: Converting Node Type ConstantOfShape
9/29: Converting Node Type Transpose
10/29: Converting Node Type Slice
11/29: Converting Node Type Slice
12/29: Converting Node Type LSTM
13/29: Converting Node Type Transpose
14/29: Converting Node Type Reshape
15/29: Converting Node Type Slice
16/29: Converting Node Type Slice
17/29: Converting Node Type LSTM
18/29: Converting Node Type Transpose
19/29: Converting Node Type Reshape
20/29: Converting Node Type Transpose
21/29: Converting Node Type Concat
22/29: Converting Node Type Concat
23/29: Converting Node Type Slice
24/29: Converting Node Type Slice
25/29: Converting Node Type Add
26/29: Converting Node Type Reshape
27/29: Converting Node Type Gemm
28/29: Converting Node Type Reshape
29/29: Converting Node Type LogSoftmax
[Core ML Pass] 6 disconnected constants nodes deleted
Translation to CoreML spec completed. Now compiling the CoreML model.
Model Compilation done.
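For context on why the log shows two LSTM nodes for a single bidirectional layer: the converter splits the bidirectional LSTM into a forward and a backward unidirectional LSTM, slices the initial states for each direction, and concatenates the per-step outputs, which matches the Slice/LSTM/Transpose/Concat groups above. A dependency-free sketch of that equivalence, using a toy running-sum cell as a hypothetical stand-in for a real LSTM cell:

```python
# Toy illustration (NOT the converter's code): a bidirectional recurrent
# layer is equivalent to one forward pass plus one backward pass over the
# reversed sequence, with the two outputs concatenated at each time step.

def run_unidirectional(seq, reverse=False):
    """Run a trivial 'cell' over the sequence (running sum stands in for
    an LSTM cell's state update)."""
    order = reversed(seq) if reverse else seq
    state, outputs = 0, []
    for x in order:
        state += x          # stand-in for the LSTM state update
        outputs.append(state)
    if reverse:
        outputs.reverse()   # realign backward outputs with time steps
    return outputs

def run_bidirectional(seq):
    """Pair the forward and backward outputs at each time step, as the
    Concat nodes in the converted graph do."""
    fwd = run_unidirectional(seq)
    bwd = run_unidirectional(seq, reverse=True)
    return list(zip(fwd, bwd))
```

The point is only that the pairs of LSTM nodes in the log are expected for a bidirectional model, not themselves a sign of the bug.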

To Reproduce

  • Python script that reproduces the error:
import torch
import onnx
from onnx_coreml import convert

# Convert pytorch model to onnx
def export_onnx(model, dummy_x, filename_onnx, input_names, output_names, verbose=True):
    # Export the model (torch.onnx.export writes the file to filename_onnx)
    torch.onnx.export(model, dummy_x, filename_onnx,
                      input_names=input_names,
                      output_names=output_names,
                      export_params=True, verbose=verbose)

# Convert model from onnx to coreml (.mlmodel)
def export_onnx_core_ml(onnx_filename_in, filename_coreml, verbose=False):
    model = onnx.load(onnx_filename_in)
    coreml_model = convert(
        model,
        minimum_ios_deployment_target='13'
    )
    # Save to coreml filename
    coreml_model.save(filename_coreml)

System environment (please complete the following information):

  • coremltools version: 3.1
  • onnx version: 1.5.0
  • onnx-coreml version: 1.1
  • OS: Linux (Ubuntu 18.04)
  • How you install python: pip
  • python version: 3.7
  • Pytorch: 1.2.0
  • any other relevant information: One thing to note is that if coremltools or onnx are not at those specific versions, I get a SEGFAULT
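Given the version sensitivity noted above, a small hypothetical guard (not part of the issue's code) can report any packages that are not at the known-working pins before attempting a conversion. Note that importlib.metadata requires Python 3.8+; on the reporter's Python 3.7 the importlib_metadata backport would be needed instead:

```python
# Hypothetical helper (an assumption, not from the issue): fail fast with
# a clear report when installed versions differ from the combination the
# reporter found to work (other combinations segfaulted for them).
from importlib import metadata  # Python 3.8+; importlib_metadata on 3.7

EXPECTED = {"coremltools": "3.1", "onnx": "1.5.0", "onnx-coreml": "1.1"}

def check_versions(expected=EXPECTED):
    """Return {package: problem} for every mismatched or missing pin;
    an empty dict means all pins are satisfied."""
    problems = {}
    for pkg, want in expected.items():
        try:
            have = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            problems[pkg] = "not installed"
            continue
        if have != want:
            problems[pkg] = "found %s, expected %s" % (have, want)
    return problems
```

Calling check_versions() before export_onnx_core_ml() and aborting on a non-empty result would surface the version mismatch instead of a later SEGFAULT.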

Generated Models (ONNX and CoreML)

models_onnx_coreml.zip

After @aseemw's PR #531, the model is generated correctly:
(screenshot: Screen Shot 2020-01-09 at 3 03 07 PM)

best_LSTM_working.mlmodel.zip

Hi guys, I also tested on release 1.2 and it works correctly.