Increasing memory usage when training nnet
Closed this issue · 1 comment
vrenkens commented
RAM usage increases steadily while training the neural net, which can cause a crash; on GPUs specifically, out-of-memory errors can occur. When training is resumed at the same iteration, memory consumption is back to normal. I don't know the cause of this problem.
vrenkens commented
Solved: operations were being added to the graph inside the training loop, so the graph (and memory use) grew with every iteration. Building all operations once, before the loop, fixes it.
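For anyone hitting the same symptom, here is a minimal sketch of the anti-pattern in plain Python (a toy `Graph` class standing in for a TensorFlow-style computation graph, not the actual project code): creating an op per iteration makes the graph grow without bound, while creating it once keeps memory flat.

```python
class Graph:
    """Toy computation graph: every op created is retained forever,
    mimicking how a real graph framework accumulates nodes."""
    def __init__(self):
        self.ops = []

    def add_op(self, name):
        self.ops.append(name)
        return name

# Anti-pattern: a new op is created on every training step,
# so the graph (and RAM) grows linearly with the step count.
leaky = Graph()
for step in range(1000):
    op = leaky.add_op("train_op_%d" % step)

# Fix: build the op once, outside the loop, and reuse it each step.
fixed = Graph()
train_op = fixed.add_op("train_op")
for step in range(1000):
    pass  # each iteration runs the pre-built train_op

print(len(leaky.ops), len(fixed.ops))  # 1000 vs 1
```

In TensorFlow itself, calling `graph.finalize()` after graph construction is a handy guard: it makes any later attempt to add an op raise an error, so this kind of leak is caught immediately instead of showing up as creeping memory usage.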