Inference speed becomes slower after converting ONNX to TFLite
Yakuho opened this issue · 3 comments
Yakuho commented
I first converted the ONNX model to a TensorFlow SavedModel (pb) with the following code:
from onnx_tf.backend import prepare
import onnx
onnx_model = onnx.load("version-RFB-320.onnx")
tf_rep = prepare(onnx_model)
tf_rep.export_graph("version-RFB-320-tensorflow")
Then I converted the SavedModel to TFLite:
import tensorflow as tf
saved_model_dir = 'version-RFB-320-tensorflow'  # this directory contains the saved_model.pb and the variables folder
save_tf_model = "version-RFB-320.tflite"
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
with open(save_tf_model, "wb") as f:
    f.write(tflite_model)
When I tried version-RFB-320.tflite on my machine, it ran predictions correctly: after NMS, the selected face boxes looked very good. But when I measured the inference time, it took about 180 ms!!!
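For reference, a minimal timing sketch with the TFLite interpreter (the input shape is read from the model; the warm-up, loop count, and num_threads=4 are illustrative assumptions, not part of the original report):

import time

import numpy as np
import tensorflow as tf

# Load the converted model; num_threads=4 is an assumption for an i5 CPU.
interpreter = tf.lite.Interpreter(model_path="version-RFB-320.tflite", num_threads=4)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
dummy = np.random.random_sample(inp["shape"]).astype(np.float32)

# One warm-up run so one-time setup is not counted.
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()

start = time.perf_counter()
for _ in range(20):
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()
print("avg inference ms:", (time.perf_counter() - start) / 20 * 1000)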
My machine: CPU i5 8th gen, GPU GTX 1050.
Could the authors give me some guidance? Thanks!
smilemakc commented
Look at the graph of the model after prepare(onnx_model), and you will most likely be shocked by what you see.
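For illustration, a rough way to count the op types in the SavedModel graph that prepare/export_graph produced (this assumes TF 2.x and a serving_default signature, which may vary with the onnx-tf version):

from collections import Counter

import tensorflow as tf

# Load the SavedModel exported above and count the node types in its graph.
loaded = tf.saved_model.load("version-RFB-320-tensorflow")
graph_def = loaded.signatures["serving_default"].graph.as_graph_def()
op_counts = Counter(node.op for node in graph_def.node)
print(op_counts.most_common(15))  # Transpose ops will likely dominate the list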
7aughing commented
Indeed, it adds far too many Transpose nodes to the tflite model.
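One way to see the inserted ops is to dump the converted model's operator list, for example (assuming a recent TensorFlow that ships tf.lite.experimental.Analyzer; a viewer such as Netron shows the same thing):

import tensorflow as tf

# Prints a per-operator breakdown of the TFLite model, so the extra
# Transpose (and other inserted) ops are easy to spot.
tf.lite.experimental.Analyzer.analyze(model_path="version-RFB-320.tflite")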
wei8171023 commented
It also adds many other extra nodes to the tflite model!!