CUDA error: out of memory with 4 RTX 2080 Ti.
coreqode opened this issue · 5 comments
coreqode commented
penincillin commented
@coreqode
How do you use 4 GPUs? Our code only supports single-GPU inference.
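Since the code only supports single-GPU inference, one way to make sure only one card is used on a multi-GPU machine is to restrict device visibility before CUDA is initialized. A minimal sketch (the variable name is standard CUDA; the choice of GPU index 0 is just an example):

```python
import os

# Make only GPU 0 visible to this process. This must be set before
# torch (or any CUDA library) is first initialized, or it has no effect.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

print(os.environ["CUDA_VISIBLE_DEVICES"])
```

Equivalently, `CUDA_VISIBLE_DEVICES=0 python <your_script>.py` on the command line achieves the same thing without touching the code.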
coreqode commented
@penincillin Thanks for the quick reply. Even with one GPU it gives the same error.
penincillin commented
@coreqode
I honestly haven't seen this problem before. There might be something wrong with the installation of Detectron2 or the hand detector, such as a wrong CUDA version. You can try running the example code of Detectron2 or the hand detector in the current environment and see whether the same problem occurs.
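One quick way to spot a CUDA version mismatch is to compare the toolkit version PyTorch was built against (`torch.version.cuda`) with the toolkit installed on the system (`nvcc --version`). A minimal sketch of the comparison, assuming version strings of the form "10.1" (the helper name is hypothetical; the actual values come from your environment):

```python
def versions_compatible(build_version: str, system_version: str) -> bool:
    """Treat CUDA versions as compatible when major.minor match exactly.

    build_version:  what PyTorch reports via torch.version.cuda, e.g. "10.1"
    system_version: what the installed toolkit reports via nvcc, e.g. "10.1"
    """
    return build_version.split(".")[:2] == system_version.split(".")[:2]

# Matching toolkit versions: fine.
print(versions_compatible("10.1", "10.1"))  # True
# Mismatched versions: a common cause of spurious CUDA errors at load time.
print(versions_compatible("10.1", "9.2"))   # False
```

If the two versions disagree, reinstalling PyTorch (and rebuilding Detectron2) against the system toolkit usually resolves errors like this one.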
coreqode commented
Yeah, it was an issue with the CUDA version.
penincillin commented
@coreqode
Sounds good. Have you solved the problem?