leonzfa/iResNet

Slow inference on KITTI dataset

cattaneod opened this issue · 3 comments

I'm running your code on the KITTI dataset, but the runtime for processing a single image pair is around 0.35s instead of 0.12s as stated in the paper.
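For reference, this is roughly how I'm timing a single pair (a minimal sketch assuming pycaffe; the warm-up pass is my own setup, and the `.tpl.prototxt` template presumably needs its input dimensions filled in first, the way test_rob.py does before loading):

```python
import time
import caffe

caffe.set_mode_gpu()
# Assumes the template prototxt has already been instantiated with
# concrete input dimensions (as test_rob.py does before loading).
net = caffe.Net('deploy_iresnet.prototxt', 'iResNet_KITTI2015.caffemodel', caffe.TEST)

net.forward()  # warm-up pass so cuDNN setup is not counted in the timing

runs = 10
start = time.time()
for _ in range(runs):
    net.forward()
print('average runtime per pair: %.3fs' % ((time.time() - start) / runs))
```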

In order to run it at 0.12s I have to decrease MAX_SIZE down to 300000, losing a lot of detail.
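For context, my understanding is that MAX_SIZE caps width*height, so the images get downscaled roughly like this (a sketch; the helper name is mine, not from test_rob.py):

```python
import math

def fit_to_max_size(width, height, max_size):
    """Scale factor so that (width * height) <= max_size, preserving aspect ratio."""
    if width * height <= max_size:
        return 1.0
    return math.sqrt(float(max_size) / (width * height))

# KITTI pairs are roughly 1242x375; with MAX_SIZE = 300000 they shrink noticeably.
s = fit_to_max_size(1242, 375, 300000)
print(s, int(1242 * s), int(375 * s))  # ~0.80 -> ~997x301
```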

Can you help me run your code faster at full image size? Thanks!

@catta202000 There are two networks here. One is used in the CVPR paper and corresponds to "deploy_iresnet.tpl.prototxt". The other is used in ROB 2018 and corresponds to "deploy_iResNet_ROB.tpl.prototxt". The latter costs more time. In the CVPR paper, we did not enlarge the images; we just resized them to an integral multiple of 64, i.e., 384*1280 for KITTI.
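A minimal sketch of that rounding (the helper is illustrative; cv2 is just one way to do the resize):

```python
import cv2

def round_up_to_multiple(x, base=64):
    # Smallest multiple of `base` that is >= x.
    return ((x + base - 1) // base) * base

def resize_for_iresnet(img):
    h, w = img.shape[:2]
    new_h, new_w = round_up_to_multiple(h), round_up_to_multiple(w)
    # KITTI: 375x1242 -> 384x1280, matching the 384*1280 mentioned above.
    return cv2.resize(img, (new_w, new_h), interpolation=cv2.INTER_LINEAR)
```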

When I use the MAX_SIZE stated in test_rob.py (1034496), I get a failure with Aborted (core dumped).
I kept reducing MAX_SIZE until I reached 200000 in order for the testing to run. Although I am getting good disparity maps, I think I am losing a lot of detail.
Can you please advise how to solve the Aborted error without changing the size?
Thanks
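In case it's useful, this is roughly how I searched for a working value (a hypothetical sketch: since Aborted (core dumped) kills the whole interpreter, each MAX_SIZE has to be tried in a fresh subprocess, and it assumes test_rob.py is changed to read MAX_SIZE from the environment instead of a hard-coded constant):

```python
import os
import subprocess

# Hypothetical bisection of MAX_SIZE: an aborted process cannot be caught
# with try/except, so each candidate runs as a separate subprocess.
for max_size in (1034496, 800000, 600000, 400000, 300000, 200000):
    env = dict(os.environ, MAX_SIZE=str(max_size))
    if subprocess.call(['python', 'test_rob.py'], env=env) == 0:
        print('largest MAX_SIZE that runs:', max_size)
        break
```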

@leonzfa Thanks for your help. Using "iResNet_KITTI2015.caffemodel" and "deploy_iresnet.tpl.prototxt" without enlarging the images, the model now runs at around 0.18s. Still more than the 0.12s, but I guess the problem now could be the hardware? I'm using a 1080 Ti.