nousr/koi

CUDA out of memory

Closed this issue · 2 comments

I have a 6 GB VRAM card.

How do I fix this error?

RuntimeError: CUDA out of memory. Tried to allocate 1024.00 MiB (GPU 0; 5.81 GiB total capacity; 3.14 GiB already allocated; 780.44 MiB free; 3.17 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I was trying to create a 256×256 image.
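The error message itself suggests one mitigation: setting `max_split_size_mb` via `PYTORCH_CUDA_ALLOC_CONF` to reduce allocator fragmentation. A minimal sketch of how that could be done before the backend starts (the 128 MB value is illustrative, not a recommendation from this thread):

```python
import os

# Must be set before PyTorch initializes CUDA, i.e. before the
# first `import torch` in the backend process. Caps the size of
# cached blocks the allocator will split, reducing fragmentation.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```

The same variable can also be exported in the shell that launches the server instead of being set in code.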

nousr commented

Please ensure you are using the fp16 model pipeline in the backend server. If that still doesn't work, you may need to close any other programs that are taking up GPU memory.
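For context on why fp16 helps here: half precision stores each tensor element in 2 bytes instead of float32's 4, roughly halving the VRAM needed for weights and activations. A small CPU-side illustration (in a diffusers-style backend this typically corresponds to loading the pipeline with `torch_dtype=torch.float16`, which is an assumption about how the backend is configured):

```python
import torch

# float16 uses 2 bytes per element vs. float32's 4, so the same
# tensor occupies half the memory when cast down.
x32 = torch.zeros(1024, 1024, dtype=torch.float32)
x16 = x32.half()  # same shape and values, half the storage

bytes32 = x32.element_size() * x32.nelement()
bytes16 = x16.element_size() * x16.nelement()
print(bytes32, bytes16)  # 4194304 2097152
```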

You are right on the cusp of the hardware this model needs to run, so you will need to take all the optimizations you can get.

If you still run into issues, I suggest using Google Colab as your backend--this should give you some more breathing room!