fabio-sim/LightGlue-ONNX

NVIDIA TX2 is hard to use

Closed this issue · 3 comments

Hi, I have run the demo successfully on a PC.
But when I attempt to use it on the TX2, it nearly drives me to despair.

Opset 16 is required by LightGlue.onnx, but the TX2 can only run onnxruntime 1.11.0, which supports opsets only up to 15.
On top of that, torch on the TX2 requires python==3.6, which stops me from installing onnxruntime>1.12.0.

Have you got any idea?

Hi @demonove, happy to hear that LightGlue-ONNX is useful for you.

ONNX opset 16 is only required because of SuperPoint's grid sample operation. See #19 for more information. DISK-LightGlue supports opset 12 and above:

I hope you find these models helpful!
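If you want to confirm which opset a given `.onnx` file targets, the usual route is `onnx.load(path).opset_import`. As a dependency-free sketch (assuming only the standard ONNX protobuf layout, where `ModelProto.opset_import` is field 8 and `OperatorSetIdProto.version` is field 2), the versions can also be read straight from the file bytes:

```python
def read_varint(buf, i):
    """Decode a protobuf varint starting at index i; return (value, next_index)."""
    value, shift = 0, 0
    while True:
        b = buf[i]
        i += 1
        value |= (b & 0x7F) << shift
        if not (b & 0x80):
            return value, i
        shift += 7


def onnx_opset_versions(model_bytes):
    """Scan a serialized ONNX ModelProto for opset_import entries (field 8)
    and return the opset versions they declare (field 2 of OperatorSetIdProto)."""
    versions, i = [], 0
    while i < len(model_bytes):
        tag, i = read_varint(model_bytes, i)
        field, wire = tag >> 3, tag & 0x7
        if wire == 2:  # length-delimited field
            length, i = read_varint(model_bytes, i)
            payload = model_bytes[i:i + length]
            i += length
            if field == 8:  # ModelProto.opset_import
                j = 0
                while j < len(payload):
                    t, j = read_varint(payload, j)
                    if t >> 3 == 2 and t & 0x7 == 0:  # version (varint)
                        v, j = read_varint(payload, j)
                        versions.append(v)
                    elif t & 0x7 == 2:  # skip strings (e.g. domain)
                        skip, j = read_varint(payload, j)
                        j += skip
                    else:
                        _, j = read_varint(payload, j)
        elif wire == 0:  # varint field (e.g. ir_version)
            _, i = read_varint(model_bytes, i)
        elif wire == 5:  # fixed 32-bit
            i += 4
        elif wire == 1:  # fixed 64-bit
            i += 8
        else:
            break  # unknown wire type; stop scanning
    return versions
```

Usage would be along the lines of `onnx_opset_versions(open("lightglue.onnx", "rb").read())`; a model that needs opset 16 will list 16 in the result.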

Hi @fabio-sim, thanks for the models you've provided.
I've run into another problem!

When using export.py to export an ONNX model, it pulls in sdpa.py, which imports torch.onnx._constant, torch.onnx._type_utils, and torch.onnx.symbolic_helper. All of these are only available in torch 2.0, but the TX2 can only run torch 1.10.

Is there any way to make export.py work with torch 1.10?

Hi @demonove

Once you have the exported ONNX models, you don't need to install PyTorch for inference. You only need ONNXRuntime:

matplotlib # Viz
numpy
# onnxruntime # CPU only
onnxruntime-gpu # CUDA
opencv-python

However, if for some reason you need to export on the device itself, you can drop the sdpa.py ops by changing the following line:

- from lightglue_onnx.ops import patch_disk_convolution_mode, register_aten_sdpa
+ from lightglue_onnx.ops.convolution_mode import patch_disk_convolution_mode
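If you'd rather keep a single export.py that works in both environments, another option is to gate the SDPA registration on the torch version instead of deleting the import. `supports_sdpa_export` below is a hypothetical helper written for this sketch, not part of the repository:

```python
def supports_sdpa_export(torch_version):
    """Return True when the torch version carries the torch.onnx internals
    that sdpa.py relies on (torch >= 2.0).

    Handles local-version suffixes such as "1.10.0+nv21.05" (common on
    Jetson builds) by stripping everything after the "+"."""
    major, minor = (int(part) for part in torch_version.split("+")[0].split(".")[:2])
    return (major, minor) >= (2, 0)


# Hedged usage: gate the import at runtime instead of editing it out.
# import torch
# from lightglue_onnx.ops import patch_disk_convolution_mode
# if supports_sdpa_export(torch.__version__):
#     from lightglue_onnx.ops import register_aten_sdpa
```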