stanfordnlp/GloVe

memory issue on training glove

lfoppiano opened this issue · 1 comments

After several attempts, I reach the part of demo.sh where the model is trained and dumped. Unfortunately, I get the following error:

(base) [lfoppian0@sakura02 GloVe]$ build/glove -write-header 1 -save-file vectors -threads 10 -input-file cooccurrence.shuf.bin -x-max 10 -iter 100 -vector-size 300 -binary 2 -vocab-file vocab.txt -verbose 0
TRAINING MODEL
Read 128517745517 lines.
Using random seed 1633594412
Error allocating memory for W

The glove command does not have a -memory option, so I wonder whether this is due to my shuffled cooccurrence file being 1.2 TB...
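
For reference, here is a rough back-of-the-envelope check of the allocation, assuming (I have not verified this against glove.c) that the parameter matrix W and a matching gradient accumulator each hold 2 * vocab_size rows of vector_size + 1 doubles, so the file size of the shuffled cooccurrences should not matter for this particular allocation:

# Hypothetical estimate of model memory: 4 * vocab_size * (vector_size + 1) * 8 bytes
VOCAB=$(wc -l < vocab.txt)
VECTOR_SIZE=300
echo "$VOCAB $VECTOR_SIZE" | awk '{printf "approx. %.2f GB for the model parameters\n", 4 * $1 * ($2 + 1) * 8 / 1024 / 1024 / 1024}'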

Any clue or suggestion is welcome.

Thanks in advance