lucidrains/deep-daze

torch won't use my GPU, or so I think.

batbeat opened this issue · 1 comment

I'm not sure why, but I don't think torch is using my GPU (RTX 2080 Super). It gives me this error:

:\Users\beaty\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\cuda\amp\grad_scaler.py:115: UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not available. Disabling.
  warnings.warn("torch.cuda.amp.GradScaler is enabled, but CUDA is not available. Disabling.")

Any help here?

I had to re-install PyTorch with CUDA enabled. For some reason, pip install deep-daze didn't get me a CUDA-enabled build.

I first checked whether torch could see CUDA by opening a Python interpreter and entering:

import torch
torch.cuda.is_available()

This returned False for me, which means the PyTorch build I had installed didn't have CUDA support.
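For a bit more detail than a bare True/False, a quick sketch like this (plain torch calls, nothing specific to deep-daze) shows which build is actually installed:

import torch

print(torch.__version__)          # a "+cpu" suffix means a CPU-only build
print(torch.cuda.is_available())  # must be True for the GPU to be used
if torch.cuda.is_available():
    # should report the RTX 2080 Super once a CUDA build is installed
    print(torch.cuda.get_device_name(0))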

I then checked which CUDA version my driver supports by running nvidia-smi at a command prompt. The supported CUDA version is shown in the top-right corner of the output.
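nvidia-smi reports the highest CUDA version the driver supports; the version PyTorch itself was built against can be read from Python. A minimal check, again just standard torch attributes:

import torch

# None on a CPU-only build; a string like "11.1" on a CUDA-enabled build.
# In practice this only needs to be no newer than what nvidia-smi reports,
# which is why an 11.1 build runs fine on an 11.4 driver.
print(torch.version.cuda)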

Go to the PyTorch website (pytorch.org) and select the right options for your installation (pip, conda, etc.), then pick a CUDA version under Compute Platform. Even though my driver reported CUDA 11.4, the 11.1 option worked fine. My working options were:

  • PyTorch Build: Stable
  • My OS: Windows
  • Package: Pip
  • Language: Python
  • Compute Platform: CUDA 11.1

Then copy the command that the website gives you into a command prompt and wait for the install to finish.
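For reference, the pip command for CUDA 11.1 looked roughly like this at the time; treat it as illustrative only and copy the exact versions and URL from the website rather than from here:

pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchaudio==0.9.0 -f https://download.pytorch.org/whl/torch_stable.html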
When I ran imagine again, everything worked, and torch.cuda.is_available() now returns True.
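If you want one last sanity check before re-running imagine, a quick smoke test is to move a small tensor onto the GPU; this is plain torch, nothing deep-daze-specific:

import torch

x = torch.rand(3, 3).cuda()  # fails with "Torch not compiled with CUDA enabled" on a CPU-only build
print(x.device)              # expect "cuda:0"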