About GPU memory
Closed this issue · 3 comments
Aristot1e commented
Hi! I tried to run your code on two RTX 3090 GPUs (24 GB each), but it failed with a "CUDA out of memory" error.
z-fabian commented
Hi,
Can you please let me know what commands you used to run the code? Make sure the use_checkpointing
flag is set to reduce GPU memory usage. You can also see all the hyperparameters and arguments printed when the code starts running. Otherwise, GPU memory shouldn't be an issue for the default models; we also used 24GB GPUs for our experiments.
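For readers hitting the same error: a flag like use_checkpointing typically enables gradient (activation) checkpointing, which recomputes intermediate activations in the backward pass instead of storing them, trading compute for memory. The sketch below is illustrative only, using PyTorch's `torch.utils.checkpoint`; the `Model` class and flag wiring are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of gradient checkpointing behind a use_checkpointing
# flag. Activations inside each checkpointed block are discarded after the
# forward pass and recomputed during backward, lowering peak GPU memory.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class Block(nn.Module):
    def __init__(self, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def forward(self, x):
        return self.net(x)


class Model(nn.Module):
    def __init__(self, depth: int = 4, dim: int = 64,
                 use_checkpointing: bool = False):
        super().__init__()
        self.blocks = nn.ModuleList(Block(dim) for _ in range(depth))
        self.use_checkpointing = use_checkpointing

    def forward(self, x):
        for block in self.blocks:
            if self.use_checkpointing and self.training:
                # Recompute this block's activations on backward
                # instead of keeping them resident in GPU memory.
                x = checkpoint(block, x, use_reentrant=False)
            else:
                x = block(x)
        return x


model = Model(use_checkpointing=True)
out = model(torch.randn(8, 64))
out.sum().backward()  # gradients flow through the checkpointed blocks
```

Note that checkpointing only helps during training (when activations are kept for the backward pass); it slows each step slightly because of the recomputation.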
z-fabian commented
Hi @Aristot1e, did you manage to run the code? Please let me know if there is still some issue.
Aristot1e commented
I have solved the problem, and the code now runs successfully. Thank you so much.