amansrivastava17/embedding-as-service

Memory and speed issue.

I am working with about 1000 sentences, each roughly 200 tokens long. When I feed them all to the model at once, both my machine and Google Colab run out of memory. When I instead feed the sentences one by one in a loop, the process becomes far too slow: after 2 hours of running, it had only produced embeddings for 40-50 sentences. Is there any way to speed up the process?
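
For reference, here is roughly what I am trying, restructured as mini-batches instead of a single call or a one-by-one loop. This is only a minimal sketch assuming the `Encoder` / `encode(texts=..., pooling=...)` interface shown in the README; the embedding name, model name, and batch size below are placeholder choices on my side, not recommendations from the library:

```python
import numpy as np
from embedding_as_service.text.encode import Encoder

# Assumed setup, following the README example; adjust embedding/model as needed.
en = Encoder(embedding='bert', model='bert_base_cased', max_seq_length=256)

# Replace with the real ~1000 sentences (each roughly 200 tokens long).
sentences = ["first example sentence", "second example sentence"]

batch_size = 32  # placeholder value; lower it if memory is still an issue
chunks = []
for start in range(0, len(sentences), batch_size):
    batch = sentences[start:start + batch_size]
    # encode() accepts a list of texts, so one call handles a whole mini-batch
    chunks.append(en.encode(texts=batch, pooling='reduce_mean'))

embeddings = np.vstack(chunks)  # shape: (len(sentences), embedding_dim)
print(embeddings.shape)
```

Would something along these lines be the intended way to use the library, or is there a built-in batching option I am missing?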