zjy526223908/DreamEditor

Question about torch.cuda.OutOfMemoryError when running the # dreambooth part of run_step2.sh


Hello, I have a question about the # dreambooth part of run_step2.sh.

When I run this part, I encounter the following error:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 58.00 MiB. GPU 0 has a total capacty of 23.69 GiB of which 38.75 MiB is free. Process 2256234 has 5.01 GiB memory in use. Including non-PyTorch memory, this process has 18.63 GiB memory in use. Of the allocated memory 18.17 GiB is allocated by PyTorch, and 117.72 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Steps: 0%|▏ | 1/400 [00:02<18:30, 2.78s/it, loss=0.113, lr=5e-6]
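The last line of the error suggests setting max_split_size_mb to avoid fragmentation, so I suppose something like the sketch below is worth a try before rerunning the step, though I'm not sure it is enough on its own (the value 128 is just a guess on my part):

```bash
# Ask PyTorch's caching allocator to split large cached blocks, which can
# reduce fragmentation-related OOMs; 128 MiB is an arbitrary starting value.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128

# Then rerun the step (or just its # dreambooth part).
bash run_step2.sh
```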

May I ask how much GPU memory you are using?
Thanks a lot!

Hey, I have similar issues on a 4090! They state that they build upon StableDreamFusion, which requires a V100 (I haven't tested that yet).
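I'm not sure which memory-saving options DreamEditor's dreambooth step actually exposes, but if it wraps a diffusers-style train_dreambooth.py, these are the usual flags I would try first on a 24 GB card. The script name and flags below are my assumption, not confirmed for this repo:

```bash
# Hypothetical sketch, assuming a diffusers-style train_dreambooth.py;
# none of these flag names are confirmed against DreamEditor's own script.
#   --mixed_precision=fp16       half-precision training
#   --gradient_checkpointing     trade compute for activation memory
#   --use_8bit_adam              8-bit optimizer states (needs bitsandbytes)
#   --train_batch_size=1         smallest per-step batch
accelerate launch train_dreambooth.py \
  --mixed_precision=fp16 \
  --gradient_checkpointing \
  --use_8bit_adam \
  --train_batch_size=1
```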