ethnhe/FFB6D

CUDA out of Memory with 2 x 3090 RTX

MiriamJo opened this issue · 2 comments

I wonder if something's off with my code or if two RTX 3090s with 24 GB of VRAM each are simply not enough. I turned the batch size all the way down; however, I have massive input images with a size of 1440x1920. I also used Apex O1 mixed precision and divided the sample_points by 48 instead of 24.

How can I further improve the code to save memory? Kind regards.

Alright, I tried it with 8x 3090s (24 GB each) and it still went out of memory. I really don't know what to do next. I added empty_cache and the like, but it always runs out of memory during training.

Maybe you can lower the batch_size; I ran it successfully on a 3090 with batch_size = 3.
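If lowering batch_size alone is not enough, gradient accumulation is a common way to keep the same effective batch size while shrinking the per-step memory footprint: run several small micro-batches and only step the optimizer once their gradients have accumulated. A minimal sketch below — the model, data, and accumulation factor are hypothetical stand-ins, not from this repo:

```python
# Sketch: gradient accumulation cuts peak memory by using small
# micro-batches while preserving the effective batch size.
# Model, data, and accum_steps here are hypothetical placeholders.
import torch
import torch.nn as nn

model = nn.Linear(16, 1)  # stand-in for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

accum_steps = 4  # 4 micro-batches of 2 ~ one effective batch of 8
data = [(torch.randn(2, 16), torch.randn(2, 1)) for _ in range(accum_steps)]

optimizer.zero_grad()
for i, (x, y) in enumerate(data):
    loss = loss_fn(model(x), y) / accum_steps  # scale so grads average correctly
    loss.backward()                            # gradients accumulate in .grad
    if (i + 1) % accum_steps == 0:
        optimizer.step()                       # one step per effective batch
        optimizer.zero_grad()
```

This can be combined with mixed precision (Apex O1 as in the original post, or native `torch.cuda.amp`) for further savings.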