Wulingtian/RepVGG_TensorRT_int8

Serialized engines are not portable across platforms?

Closed this issue · 1 comment

"https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#serial_model_python" shows
"Note: Serialized engines are not portable across platforms or TensorRT versions. Engines are specific to the exact GPU model they were built on (in addition to the platforms and the TensorRT version)."

But it seems you use Python to generate the TensorRT engine and then use C++ for inference?

It is portable across languages. The note in the docs restricts portability across GPU models, platforms, and TensorRT versions, not across language APIs: an engine serialized with the Python API can be deserialized by the C++ runtime on the same machine with the same TensorRT version.
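A minimal sketch of why this works: `engine.serialize()` in the Python API produces an opaque byte blob, and the C++ side just reads those bytes back and hands them to `nvinfer1::IRuntime::deserializeCudaEngine`. The engine bytes below are a hypothetical stand-in (building a real engine requires a GPU and TensorRT); the point is only that the file is plain binary data with no language-specific framing.

```python
import os
import tempfile

# Hypothetical stand-in for the bytes returned by engine.serialize()
# in the TensorRT Python API; a real blob would come from a built engine.
engine_blob = b"\x00TRT-engine-bytes\x01\x02"

path = os.path.join(tempfile.mkdtemp(), "repvgg_int8.trt")

# Python side: write the serialized engine to disk.
with open(path, "wb") as f:
    f.write(engine_blob)

# The C++ side would read this same file into a buffer and call
# runtime->deserializeCudaEngine(buffer, size); here we just show
# the bytes survive the round trip unchanged.
with open(path, "rb") as f:
    restored = f.read()

assert restored == engine_blob
```

So the language used to build the engine does not matter; what must match between build time and inference time are the GPU model, the platform, and the TensorRT version.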