How to achieve realtime inference?
John1231983 opened this issue · 7 comments
John1231983 commented
Hi, the inference code runs very fast. Could you share any tricks for achieving realtime inference? Does it run on the GPU or the CPU?
panda-lab commented
The Android app runs on the ARM CPU with ncnn. If you want to design a realtime landmark detector, a lightweight network is needed.
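(Editor's note: the thread doesn't spell out why a lightweight network is so much cheaper. As intuition, lightweight backbones like MobileNet replace standard convolutions with depthwise-separable ones. Below is a rough multiply-accumulate (MAC) count comparison; the layer shape used is an illustrative assumption, not taken from this repo.)

```python
# Rough MAC counts for one conv layer, illustrating why
# depthwise-separable convs (MobileNet-style) are cheap enough
# for realtime CPU inference.
# The layer shape (112x112 feature map, 32->64 channels, 3x3 kernel)
# is an illustrative assumption, not a measurement from this repo.

def standard_conv_macs(h, w, c_in, c_out, k):
    # Every output pixel mixes all input channels over a k x k window.
    return h * w * c_in * c_out * k * k

def separable_conv_macs(h, w, c_in, c_out, k):
    # Depthwise k x k conv per input channel, then a 1x1 pointwise conv.
    return h * w * c_in * k * k + h * w * c_in * c_out

h = w = 112
std = standard_conv_macs(h, w, 32, 64, 3)
sep = separable_conv_macs(h, w, 32, 64, 3)
print(f"standard:  {std:,} MACs")
print(f"separable: {sep:,} MACs  ({std / sep:.1f}x cheaper)")
```

The separable version is roughly 8x cheaper at this shape, which is why MobileNet-style backbones (run through ncnn on the ARM CPU) can hit realtime.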
John1231983 commented
lightweight network
I am using MobileNetV2 with alpha 0.25, as the paper mentions. Do you use the same thing, or a customized MobileNetV2? Any suggestions?
panda-lab commented
MobileNetV2 is OK.
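(Editor's note: for readers unfamiliar with what alpha = 0.25 does: the width multiplier scales every layer's channel count, and reference MobileNet implementations round the result to a multiple of 8 so the kernels stay SIMD-friendly. The helper below mirrors that rounding rule; the channel widths looped over are just illustrative examples.)

```python
def make_divisible(v, divisor=8, min_value=None):
    """Round a scaled channel count to a multiple of `divisor`,
    never dropping more than 10% below the unrounded value
    (the rounding rule used by reference MobileNet implementations)."""
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

alpha = 0.25  # width multiplier from the paper
for c in (16, 24, 32, 96, 320):  # example MobileNetV2 channel widths
    print(f"{c:3d} -> {make_divisible(c * alpha)}")
```

Note how small layers get clamped to the 8-channel floor (16 * 0.25 = 4 rounds up to 8), so the speedup from alpha = 0.25 is somewhat less than the naive 16x you would expect from scaling MACs by alpha squared.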
John1231983 commented
Thanks. But I found the speed is still slow. I use Canvas to draw the landmark points. Is that fine?
panda-lab commented
If you are running on Android or iOS devices, draw the points with OpenGL.
John1231983 commented
So OpenGL is faster than the Canvas API? Is that right? Thanks so much.
panda-lab commented
yes