onnx/onnx-tensorflow

Exported TensorFlow Lite model doesn't use the ONNX model's input and output names

linmeimei0512 opened this issue · 1 comment

My environment is:

  • Python version: 3.6
  • ONNX version: 1.11.0
  • ONNX-TF version: 1.10.0
  • Tensorflow-gpu version: 2.5.0

My final goal:
Convert an ONNX model to TFLite.

Question:
My ONNX model's input name is [input], and its output names are [output1, output2].
[Screenshot 2022-10-27 17-03-40]

Converting ONNX to TensorFlow uses the SavedModel format in TensorFlow 2.x.

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model and export it as a TensorFlow SavedModel directory.
onnx_model = onnx.load(onnx_model_path)
onnx_tf_exporter = prepare(onnx_model)
onnx_tf_exporter.export_graph(tensorflow_model_output_path)
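
The names the exporter assigns can also be checked programmatically instead of in Netron. A minimal sketch, assuming tensorflow_model_output_path is the SavedModel directory exported above:

import tensorflow as tf

# Load the exported SavedModel and print its serving signature.
loaded = tf.saved_model.load(tensorflow_model_output_path)
infer = loaded.signatures['serving_default']
print(infer.structured_input_signature)  # input names and TensorSpecs
print(infer.structured_outputs)          # output names and TensorSpecs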

When I display the output model (saved_model.pb) in Netron, the input name is not equal to the ONNX one.
[Screenshot 2022-10-27 17-02-48]

Then I convert the SavedModel to tflite.

import tensorflow as tf

# Convert the SavedModel directory written above to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model(tensorflow_model_path)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open(tensorflow_lite_model_output_path, 'wb') as f:
    f.write(tflite_model)

The TFLite model's input name is equal to the SavedModel one, [serving_default_input:0], and the output names are [StatefulPartitionedCall:0, StatefulPartitionedCall:1].
[Screenshot 2022-10-27 17-18-23]
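
The tensor names can be confirmed programmatically as well. A minimal sketch, assuming tensorflow_lite_model_output_path is the .tflite file written above:

import tensorflow as tf

# Print the raw tensor names the TFLite converter assigned.
interpreter = tf.lite.Interpreter(model_path=tensorflow_lite_model_output_path)
interpreter.allocate_tensors()
print([d['name'] for d in interpreter.get_input_details()])   # e.g. ['serving_default_input:0']
print([d['name'] for d in interpreter.get_output_details()])  # e.g. ['StatefulPartitionedCall:0', ...]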

How can I make the TFLite input and output names equal to the ONNX ones?
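
One possible workaround rather than a confirmed fix: recent TensorFlow versions carry the SavedModel's SignatureDef into the .tflite file, and the signature keys keep the original ONNX names, so the model can be driven by those names through a signature runner even though the raw tensor names remain serving_default_* and StatefulPartitionedCall:*. A sketch, assuming a TensorFlow version that provides tf.lite.Interpreter.get_signature_runner (roughly 2.7+) and a single float32 input:

import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path=tensorflow_lite_model_output_path)
print(interpreter.get_signature_list())
# e.g. {'serving_default': {'inputs': ['input'], 'outputs': ['output1', 'output2']}}

# Call the model through the signature, using the ONNX input name as a keyword;
# the returned dict is keyed by the ONNX output names.
runner = interpreter.get_signature_runner('serving_default')
shape = interpreter.get_input_details()[0]['shape']  # assumes one input tensor
outputs = runner(input=np.zeros(shape, dtype=np.float32))
print(outputs.keys())  # e.g. dict_keys(['output1', 'output2'])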

Duplicate of #984