nagejacob/RecurrentMobileNet

issue about inference time

HuiiJi opened this issue · 1 comment

Thanks for your code!
Could you share your inference time at (256, 256)?

Inference on a 256×256 image (without self-ensemble) takes about 0.015 s on a Tesla V100, so inference on 1024 images with self-ensemble takes about 30 minutes.
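
For reference, here is a minimal PyTorch timing sketch under assumptions: the small convolutional `net` below is only a hypothetical stand-in for the repo's actual model, and the 1×3×256×256 input is a guess at the expected shape; substitute the real network and data. It uses warm-up iterations and `torch.cuda.synchronize()` so the measurement reflects GPU execution rather than kernel-launch overhead.

```python
import time
import torch
import torch.nn as nn

# Hypothetical stand-in network; replace with the actual model from this repo.
net = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1),
).cuda().eval()

# Assumed input shape (batch 1, RGB, 256x256); adjust to the model's real input.
x = torch.randn(1, 3, 256, 256, device="cuda")

with torch.no_grad():
    for _ in range(10):              # warm-up so one-time CUDA setup is excluded
        net(x)
    torch.cuda.synchronize()

    runs = 100
    start = time.time()
    for _ in range(runs):
        net(x)
    torch.cuda.synchronize()         # wait for GPU work before stopping the clock

print(f"avg per-image time: {(time.time() - start) / runs:.4f} s")
```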