karfly/learnable-triangulation-pytorch

RAM usage keeps increasing during training

kyang-06 opened this issue · 0 comments

When trying to train your model from scratch, I noticed that during the 1st epoch, RAM usage starts below 10 GB and climbs past 90 GB until the process runs out of memory. This happens with both the algebraic and the volumetric approach, which is weird.
[Screenshot: Snipaste_2020-11-26_17-17-09]
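For anyone hitting a similar symptom: one common cause of steadily growing RAM in PyTorch training loops (not confirmed to be the cause here) is appending loss tensors to a Python list for logging, which keeps each step's autograd graph alive; appending `loss.item()` instead releases it. The sketch below is purely illustrative, using a hypothetical `FakeLoss` stand-in rather than real torch tensors:

```python
# Hypothetical illustration of the "accumulating loss tensors" leak pattern.
# FakeLoss stands in for a torch loss tensor; `graph` stands in for the
# autograd graph the tensor would keep referenced.

class FakeLoss:
    def __init__(self, value, graph):
        self.value = value
        self.graph = graph  # large per-step object, like an autograd graph

    def item(self):
        # Return the plain Python number, dropping the graph reference.
        return self.value


def train(num_steps, keep_graphs):
    history = []
    for step in range(num_steps):
        graph = bytearray(1024)  # simulates per-step graph memory
        loss = FakeLoss(float(step), graph)
        if keep_graphs:
            history.append(loss)         # leak: every graph stays referenced
        else:
            history.append(loss.item())  # safe: only the scalar survives
    return history


leaky = train(100, keep_graphs=True)   # holds 100 "graphs" in memory
safe = train(100, keep_graphs=False)   # holds only 100 floats
```

In real code the equivalent fix is logging `loss.item()` (or `loss.detach()`) instead of the loss tensor itself, so memory stays flat across the epoch.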