Getting OutOfMemory error: CUDA
I get an error when I try to use the model on an ml.g4dn.4xlarge instance.
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 592.00 MiB (GPU 0; 14.62 GiB total capacity; 14.33 GiB already allocated; 175.94 MiB free; 14.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
I am using the script in the README QuickStart section.
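(Note that in this traceback reserved memory is roughly equal to allocated memory, 14.33 GiB each, so the model genuinely doesn't fit rather than the memory being fragmented. For cases where reserved memory really is much larger than allocated, the fix the error message hints at looks like the sketch below; the 128 MiB split size is an arbitrary starting point, not a value from this thread.)

```python
# Set before any CUDA allocation happens (e.g. at the top of the script),
# so the caching allocator picks the option up.
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the env var is set
```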
What GPU do you have?
You need at least 27 GB of GPU memory for the 7B-parameter model (around 10 GB for the 3B one).
You can load the model in 16-bit or 8-bit precision instead. If you know your way around Python, it shouldn't be too hard; if not, there are projects like https://github.com/oobabooga/text-generation-webui that can handle this for you.
The official notebook also has a load_in_8bit checkbox.
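For reference, a minimal sketch of both options using Hugging Face transformers; the model id is a placeholder (the thread doesn't name the exact checkpoint), `device_map="auto"` requires the accelerate package, and `load_in_8bit` requires bitsandbytes:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-7b-model"  # placeholder; substitute the actual checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Option 1: 16-bit weights (~2 bytes per parameter, so roughly 14 GB for a 7B model)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Option 2: 8-bit weights via bitsandbytes (~1 byte per parameter, roughly 7 GB for 7B)
# model = AutoModelForCausalLM.from_pretrained(
#     model_id,
#     load_in_8bit=True,
#     device_map="auto",
# )
```

The g4dn instances have a single T4 with about 16 GB of memory, so 16-bit loading of a 7B model is a tight fit at best; 8-bit should be the more comfortable option there.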