torch using all GPU RAM
YoSoyElZorlak opened this issue · 1 comment
YoSoyElZorlak commented
How much memory is needed to run the 7B model? I have a 3060 card with 12 GB of VRAM, and when I run inference_example.py I get the following error:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 172.00 MiB (GPU 0; 12.00 GiB total capacity; 11.29 GiB already allocated; 0 bytes free; 11.29 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
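For reference, 7B parameters in fp32 take roughly 28 GB for the weights alone (7e9 × 4 bytes), which is why even a 24 GB card can run out of memory; fp16 halves that to about 14 GB. Below is a minimal sketch of the two mitigations the error message itself points at, assuming a generic PyTorch checkpoint — the path and loading code are hypothetical, not the repo's actual inference_example.py:

```python
# Hedged sketch, not the repo's script: illustrates the two knobs the
# OOM message mentions. Assumes a hypothetical checkpoint "model.pt"
# containing a full nn.Module.
import os

# Must be set before CUDA is initialized; max_split_size_mb reduces
# allocator fragmentation, as the error message suggests.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch

model = torch.load("model.pt", map_location="cpu")  # hypothetical path
model = model.half()        # fp16 roughly halves weight memory vs fp32
model = model.to("cuda")

with torch.inference_mode():  # skip autograd bookkeeping at inference
    ...                       # run the model's forward/generate here
```

Note that even in fp16 the 7B weights (~14 GB) will not fit in 12 GB, so on a 3060 you would additionally need something like 8-bit quantization or CPU offloading.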
akanyaani commented
I have an RTX 3090, which has 24 GB of memory, and I'm still getting the same error. Did you find a solution?