algorithmiaio/langpacks

MemoryError in Python 2.7 Langpack

Closed this issue · 2 comments

I get a Python MemoryError when I try to run memory-intensive methods of the word2vec algorithm.

Locally, loading the model into memory takes around 8.3 GB and it works without much of a problem.

How can we address this issue?

I also didn't tag this as a bug, as I'm not sure whether we're going to increase memory limits for workers.

Fixed the problem by trimming the memory usage.
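For reference, a minimal sketch of one way to trim word2vec memory usage, assuming the model is loaded with gensim (the issue doesn't name the library, and the file path and vocabulary limit below are illustrative): cap the number of vectors loaded, or memory-map a saved KeyedVectors file.

```python
# Sketch only: the issue does not say which library or file was used.
from gensim.models import KeyedVectors

# Load only the most frequent 500k vectors instead of the full vocabulary,
# which can cut memory usage substantially.
vectors = KeyedVectors.load_word2vec_format(
    'GoogleNews-vectors-negative300.bin',  # illustrative path
    binary=True,
    limit=500000,
)

# Alternatively, save the vectors once and memory-map them on load,
# so they stay on disk and are paged in on demand.
vectors.save('vectors.kv')
vectors = KeyedVectors.load('vectors.kv', mmap='r')
```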