peteryuX/retinaface-tf2

inference time

AlexArtemis opened this issue · 0 comments

I tested the inference time of this model with a 1920*1080 image on an NVIDIA V100. I use TensorFlow Serving, and it takes about 170 ms for a single image. Do you know what causes the gap between the time reported in the paper and this model?
Thanks.
@peteryuX @magikerwin1993
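
For reference, a minimal sketch of how one might time the raw SavedModel forward pass directly, to separate the model's compute time from TF Serving overhead (gRPC/HTTP, image decoding, batching, CPU-GPU copies). The export path, signature name, and input shape/dtype here are assumptions and need to match your actual export:

```python
import time

import numpy as np
import tensorflow as tf

# Path to the SavedModel exported for TF Serving (assumption -- adjust to your export).
MODEL_DIR = "exported/retinaface_res50"

loaded = tf.saved_model.load(MODEL_DIR)
infer = loaded.signatures["serving_default"]  # default TF Serving signature name

# Dummy 1080p RGB frame, batch of 1; dtype/shape depend on how the model was exported.
frame = np.random.randint(0, 255, size=(1, 1080, 1920, 3)).astype(np.float32)
frame = tf.constant(frame)

# Warm-up: the first calls include graph tracing and CUDA kernel setup.
for _ in range(10):
    infer(frame)

# Timed runs: raw forward pass plus host transfer, no gRPC/HTTP or image decoding.
runs = 100
start = time.perf_counter()
for _ in range(runs):
    outputs = infer(frame)
    _ = {k: v.numpy() for k, v in outputs.items()}  # force device sync
avg_ms = (time.perf_counter() - start) / runs * 1000
print(f"Average model latency: {avg_ms:.1f} ms over {runs} runs")
```

If the number printed here is much lower than 170 ms, the gap is mostly serving/pre-processing overhead rather than the model itself.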