Inference speed
Closed this issue · 1 comment
Tacha-S commented
When running demo.py, gedi.compute() takes 8-10 seconds to execute (run on a GTX 1660 Ti).
The paper mentions that GeDi runs in 1.454 ms, but does demo.py achieve a similar speed with its settings?
If the settings differ, what is the approximate expected execution time for demo.py?
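As an aside, a common pitfall when timing GPU-backed calls like gedi.compute() is that CUDA kernels run asynchronously, so a naive wall-clock measurement can be misleading. A minimal, hedged timing sketch (the helper name and the stand-in workload are illustrative, not part of the GeDi codebase; for CUDA code you would additionally call torch.cuda.synchronize() before reading the clock):

```python
import time

def time_call(fn, *args, warmup=1, runs=3):
    """Average wall-clock time of fn(*args) over `runs` calls.

    Note: for CUDA-backed calls (e.g. gedi.compute()), insert
    torch.cuda.synchronize() before each perf_counter() read so that
    queued GPU kernels are included in the measurement.
    """
    for _ in range(warmup):       # warm-up to exclude one-time setup cost
        fn(*args)
    start = time.perf_counter()
    for _ in range(runs):
        fn(*args)
    return (time.perf_counter() - start) / runs

# usage with a stand-in CPU workload
avg_s = time_call(lambda: sum(range(10**5)))
```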
Tacha-S commented
inference time per descriptor
I understand.
So the paper's figure is per descriptor: 1.454 ms/descriptor × 5k descriptors ≈ 7.27 s total, which is roughly consistent with the 8-10 s I measured.
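The back-of-envelope scaling above can be checked directly (the 5k descriptor count is the figure mentioned in this thread, assumed to match demo.py's settings):

```python
# Scale the paper's per-descriptor time to ~5k descriptors.
ms_per_descriptor = 1.454       # from the paper
num_descriptors = 5000          # descriptor count assumed from this thread
total_s = ms_per_descriptor * num_descriptors / 1000.0
print(f"{total_s:.2f} s")       # 7.27 s, close to the 8-10 s observed
```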