MemoryError on windows with large dataset
Opened this issue · 0 comments
fontiago commented
Running the default main.py with my creds added. I'm getting some varying stack traces too, but it consistently seems to be pickle collapsing once around 3 gigabytes of data have built up. As you can see in the screenshot, I ought to have plenty of RAM for it; before I ran the script I had 15.3 GB in use.
I do have, uh, 33052 activities downloaded according to the tool, so that can't be helping (or it might just not be practical to support). No big deal if you consider this too much of a pain to debug!
Start Timer 'Get all PGCRs from individual files'
Exception in thread Thread-3:
Traceback (most recent call last):
  File "C:\Users\eden\AppData\Local\Programs\Python\Python37-32\lib\threading.py", line 926, in _bootstrap_inner
    self.run()
  File "C:\Users\eden\AppData\Local\Programs\Python\Python37-32\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\eden\AppData\Local\Programs\Python\Python37-32\lib\site-packages\multiprocess\pool.py", line 470, in _handle_results
    task = get()
  File "C:\Users\eden\AppData\Local\Programs\Python\Python37-32\lib\site-packages\multiprocess\connection.py", line 254, in recv
    return _ForkingPickler.loads(buf.getbuffer())
  File "C:\Users\eden\AppData\Local\Programs\Python\Python37-32\lib\site-packages\dill\_dill.py", line 386, in loads
    file = StringIO(str)
MemoryError
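For what it's worth, one thing that might matter here: the interpreter path in the trace (`Python37-32`) looks like a 32-bit build of Python, and a 32-bit process is capped at roughly 2-4 GB of address space on Windows no matter how much physical RAM is free, which would line up with pickle dying around the 3 GB mark. A quick sketch to check which build is actually running (just a diagnostic snippet, not part of the repo):

```python
import struct
import sys

# Pointer size in bits: 32 means a 32-bit interpreter, which caps the
# whole process at roughly 2-4 GB of address space on Windows
# regardless of how much RAM is installed.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python, sys.maxsize={sys.maxsize}")
```

If that prints 32, reinstalling the 64-bit Python from python.org might be worth trying before digging further.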