google/mannequinchallenge

Inference Performance

Tetsujinfr opened this issue · 2 comments

Hi
Thanks for sharing this promising work. I am planning to run it locally once I have gotten through all the install/config hoops. I did not see perf data in the paper, the blog, or this repo, but I may have missed it. I will share my results, but it would be nice if there were a list of perf numbers for different hardware specs.
In the meantime, do you have some performance benchmarks to share?

fcole commented

I don't have a benchmark handy, sorry. Roughly from memory, I'd say the RGB-only inference takes 0.5 seconds or so on a Quadro P1000 (a relatively weak GPU).

Tetsujinfr commented

Thanks for your reply. When I have a chance I will share RGB-only perf numbers from my 980 Ti, which I was able to run without trouble. Initially I was thinking more of the full-model inference time, but I do not expect to run that one anytime soon, given the code changes required to run it on video in the wild and to measure it properly (i.e. including all processing steps, e.g. the FlowNet2 and Mask R-CNN runs).
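In case it helps anyone comparing numbers across GPUs: CUDA calls are asynchronous, so naive wall-clock timing can under-report. Below is a minimal timing sketch of how I would measure a single forward pass; the model and input shape are placeholders, not the actual mannequinchallenge network, so swap in the real model and a representative frame size before drawing conclusions.

```python
import time
import torch

# Placeholder model and input -- substitute the actual network and a real
# RGB frame batch; the names and shapes here are illustrative only.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3, padding=1)).cuda().eval()
dummy_input = torch.randn(1, 3, 288, 512, device="cuda")

with torch.no_grad():
    # Warm-up iterations so one-time costs (kernel launch, cudnn autotuning)
    # are not counted in the measurement.
    for _ in range(5):
        model(dummy_input)

    torch.cuda.synchronize()  # wait for pending GPU work before starting the clock
    start = time.perf_counter()
    for _ in range(20):
        model(dummy_input)
    torch.cuda.synchronize()  # make sure the timed work has actually finished
    elapsed = (time.perf_counter() - start) / 20

print(f"mean forward pass: {elapsed * 1000:.1f} ms")
```

The same pattern (warm-up, synchronize, average over several runs) would apply per stage if someone eventually times the full pipeline, with FlowNet2 and Mask R-CNN each measured separately.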