XiandaGuo/OpenStereo

Have you tried converting the PyTorch model via ONNX to a TensorRT engine for faster inference?

Closed this issue · 1 comment


We haven't tried it yet, but we plan to incorporate it in future work. Please stay tuned.
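For anyone who wants to experiment in the meantime, below is a minimal sketch of the first step, exporting a stereo model to ONNX and then building a TensorRT engine with `trtexec`. The `DummyStereoNet` class, the left/right input shapes, and the file names are placeholders for illustration only, not OpenStereo's actual interface; the real model and checkpoint loading code would go in their place.

```python
import torch
import torch.nn as nn


class DummyStereoNet(nn.Module):
    """Placeholder standing in for an actual OpenStereo model.

    It only illustrates the assumed two-input (left/right image)
    signature with a single disparity-like output.
    """

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(6, 1, kernel_size=3, padding=1)

    def forward(self, left, right):
        # Concatenate the image pair along the channel dimension and
        # predict a 1-channel map; a real stereo network is far more complex.
        return self.conv(torch.cat([left, right], dim=1))


model = DummyStereoNet().eval()

# Dummy inputs; 1x3x384x1248 (a KITTI-like resolution) is an assumption.
left = torch.randn(1, 3, 384, 1248)
right = torch.randn(1, 3, 384, 1248)

# Export the model to ONNX with named inputs/outputs.
torch.onnx.export(
    model,
    (left, right),
    "stereo_model.onnx",
    input_names=["left", "right"],
    output_names=["disparity"],
    opset_version=16,
)

# The resulting ONNX graph can then be converted to a TensorRT engine
# with NVIDIA's trtexec CLI, e.g.:
#   trtexec --onnx=stereo_model.onnx --saveEngine=stereo_model.plan --fp16
```

Note that real stereo networks often contain ops (e.g. grid sampling or custom correlation layers) that may need a higher opset, ONNX simplification, or TensorRT plugins before the engine builds successfully.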