gesiscss/orc

Time issue when using external library

stannida opened this issue · 4 comments

Hi,

I am using an external library for inferring gender from images. I have a notebook (link to Binder) in which one can upload an image (or provide a URL) and the model then predicts gender. When testing this notebook on a local machine it usually takes around 1-3 seconds per image (the highlighted part of the screenshot shows the computation time reported by the library):
Screenshot 2020-11-27 at 16 40 18

And 50-60 seconds per image in the Binder environment:
Screenshot 2020-11-27 at 16 48 13

The model is saved in the same GitHub repo and is downloaded once at the beginning. The delay occurs during the call to the infer function (this function), which reads the JSON data and predicts the results (using PyTorch).

It seems the problem might be GPU-related, although when initializing the M3inference class the parameter use_cuda is set to False, which controls whether to run on a GPU. Parallelization only takes effect when multiple GPUs are available, or via the num_workers parameter of the infer method, which is set to 0 in the notebook.
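Since the slowdown could sit either in loading the model and JSON data or in the forward pass itself, a small stdlib timing helper can help narrow it down (a sketch; `timed` is not part of m3inference):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, results=None):
    """Print (and optionally record) the wall-clock time of a code block."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        if results is not None:
            results[label] = elapsed
        print(f"{label}: {elapsed:.2f}s")
```

In the notebook this could wrap the two phases separately, e.g. `with timed("load"): m3 = M3Inference(use_cuda=False)` and `with timed("infer"): ...`, to see which phase accounts for the 50-60 seconds on Binder.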

Would appreciate any help or suggestions on what might be causing the time delay.

Best regards,
Aleksandra

Are you using a GPU locally?

It seems the problem might be GPU-related, although when initializing the M3inference class the parameter use_cuda is set to False, which controls whether to run on a GPU. Parallelization only takes effect when multiple GPUs are available, or via the num_workers parameter of the infer method, which is set to 0 in the notebook.

We don't have GPUs on GESIS Notebooks; I think Google Colab/Kaggle are the only publicly available services that offer GPUs.

Are you using a GPU locally?

No, I set the parameter to False, and the screenshots were taken without using the GPU both locally and in Binder.

Just to check, can you run the code locally with num_workers=2 (the GESIS Notebooks limit)? Thanks!
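One thing worth checking here (an assumption, not something confirmed in the thread): containerized services like Binder often report the host's full CPU count while cgroups restrict the process to far fewer cores, which can make PyTorch oversubscribe threads and slow down CPU inference dramatically. A sketch for finding the CPU budget the process can actually use:

```python
import os

def effective_cpu_count():
    """Return the number of CPUs this process may actually run on.

    os.sched_getaffinity(0) honours the container's CPU mask on Linux;
    os.cpu_count() is the fallback elsewhere (it reports the host total).
    """
    try:
        return len(os.sched_getaffinity(0))
    except AttributeError:  # macOS/Windows lack sched_getaffinity
        return os.cpu_count() or 1
```

The result could then be passed to `torch.set_num_threads(...)` before calling infer; whether thread oversubscription actually explains the difference here would need testing on Binder.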

Closing the issue for the time being.
Let us know if you found a solution for this @stannida :)