0.25° GenCast GPU Memory Requirements
Closed this issue · 2 comments
Hi, just wondering if you have an estimate of how much GPU RAM is needed to run the 0.25° version of GenCast. It looks like I ran out of memory on a 48GB NVIDIA L40.
The docs here say you'll need ~60GB for inference on GPU:
https://github.com/google-deepmind/graphcast/blob/main/docs/cloud_vm_setup.md#running-inference-on-gpu
I managed to get it to run on an H100, which has an 80GB VRAM cap, and saw it consume ~68GB of VRAM.
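For anyone else hitting this, here's a quick sanity check before launching inference. The ~60GB threshold comes from the docs linked above; the helper name is made up for illustration and is not part of GenCast:

```python
# Hypothetical helper (not part of GenCast): check whether a GPU's VRAM
# meets the ~60GB documented for 0.25-degree GenCast inference on GPU.
DOCUMENTED_REQUIREMENT_GB = 60  # from the graphcast cloud_vm_setup docs


def meets_gencast_requirement(vram_gb: float,
                              required_gb: float = DOCUMENTED_REQUIREMENT_GB) -> bool:
    """Return True if the GPU's VRAM is at least the documented requirement."""
    return vram_gb >= required_gb


# Numbers from this thread:
print(meets_gencast_requirement(48))  # L40, 48GB  -> False (OOM observed)
print(meets_gencast_requirement(80))  # H100, 80GB -> True (~68GB actually used)
```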
It seems the 0.25° GenCast model requires around 60GB of GPU memory for inference, as the documentation states, so running out of memory on a 48GB L40 is expected. A GPU with more memory, such as the H100 with 80GB of VRAM, should work.