Out of memory for ~1GB corpus
tim5go opened this issue · 5 comments
tim5go commented
I am using around 1GB Chinese data for training.
My server has around 30GB RAM, but it is still running out of memory.
May I know what is the proper hardware setting for such a corpus?
Thanks~
godmo commented
Will the author update this to TensorFlow 1.2, or rewrite it with Keras? Thanks~
indiejoseph commented
@tim5go what is your batch size? I usually limit the batch size / sequence length to avoid OOM
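One common way to keep batch size / sequence length in check is to feed the corpus through a lazy batch generator, so per-step memory depends on `batch_size * seq_length` rather than on corpus size. A minimal sketch (the function name and slicing scheme here are illustrative, not this repo's actual loader):

```python
def char_batches(data, batch_size, seq_length):
    """Yield (input, target) batches lazily from a long token sequence.

    Each batch holds batch_size rows of seq_length tokens, so per-step
    memory is O(batch_size * seq_length), independent of corpus size.
    """
    num_batches = len(data) // (batch_size * seq_length)
    # Keep one extra token so targets can be the inputs shifted by one.
    data = data[: num_batches * batch_size * seq_length + 1]
    for i in range(num_batches):
        xs, ys = [], []
        for b in range(batch_size):
            start = (b * num_batches + i) * seq_length
            xs.append(data[start : start + seq_length])
            ys.append(data[start + 1 : start + seq_length + 1])
        yield xs, ys
```

Shrinking `batch_size` or `seq_length` here directly shrinks each yielded batch, which is usually the quickest OOM fix.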
indiejoseph commented
@godmo I am planning to rewrite it with TensorFlow 1.x+
tim5go commented
@indiejoseph
I used the default batch_size; maybe it is too large for a huge corpus?
indiejoseph commented
@tim5go so you could try with a batch size of 1 first
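A rough back-of-envelope for why a batch size of 1 helps: an RNN's per-step activation memory grows linearly with `batch_size * seq_length`, so cutting the batch size from, say, 50 to 1 cuts that term by 50x. The formula below is a deliberately crude sketch that ignores gradients, optimizer state, and the embedding/softmax layers, and the default sizes in it are assumptions, not the repo's settings:

```python
def rough_activation_bytes(batch_size, seq_length, hidden_size, num_layers,
                           bytes_per_float=4):
    # Very rough: one hidden-state vector per layer, per time step,
    # per example in the batch, stored as 32-bit floats.
    return batch_size * seq_length * hidden_size * num_layers * bytes_per_float

# Assumed settings: batch of 50 vs. 1, seq_length 50, hidden 128, 2 layers.
full = rough_activation_bytes(50, 50, 128, 2)
tiny = rough_activation_bytes(1, 50, 128, 2)
```

Note that a 1GB Chinese corpus also inflates the character vocabulary (thousands of distinct characters), so the softmax/embedding layers can dominate memory regardless of batch size; if batch size 1 still OOMs, the vocabulary size is the next thing to check.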