Serialized engines are not portable across platforms?
Bonsen commented
"https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#serial_model_python" shows
"Note: Serialized engines are not portable across platforms or TensorRT versions. Engines are specific to the exact GPU model they were built on (in addition to the platforms and the TensorRT version)."
But it seems you use Python to build and serialize the TensorRT engine, and then use C++ to run inference with it. Is that safe?
Bonsen commented
It is portable across languages. The restriction in the docs is about the GPU model, the platform, and the TensorRT version the engine was built with, not about the language used to serialize or deserialize it. The serialized engine is just a byte blob, so Python can write it and C++ can load the same file.
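A minimal sketch of why the language handoff works: the file holds an opaque byte blob, and the bytes Python writes are exactly the bytes C++ reads back. The blob below is a stand-in, since producing a real one requires a GPU; in actual TensorRT code the Python side would write `engine.serialize()` and the C++ side would call `runtime->deserializeCudaEngine(...)` on the same bytes (those API names are from TensorRT, but this snippet does not depend on them).

```python
import os
import tempfile

# Stand-in for the real blob from engine.serialize() (hypothetical `engine`
# object; building one needs TensorRT and a GPU).
blob = b"\x93example-engine-bytes"

# Python side: write the serialized engine to disk.
path = os.path.join(tempfile.mkdtemp(), "model.engine")
with open(path, "wb") as f:
    f.write(blob)

# C++ side (or any language): read the identical bytes back. In TensorRT C++
# these bytes would be passed to runtime->deserializeCudaEngine(data, size).
with open(path, "rb") as f:
    restored = f.read()

assert restored == blob
print("round trip ok:", restored == blob)
```

The same round trip only succeeds at the TensorRT level when the loading machine has the same GPU model, platform, and TensorRT version as the one that built the engine, per the quoted note in the docs.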