facebookresearch/frankmocap

CUDA error: out of memory with 4 RTX 2080 Ti.

coreqode opened this issue · 5 comments

@coreqode
How do you use 4 GPUs? Our code only supports single-GPU inference.
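
A common way to make sure only one GPU is visible to the process (this is generic PyTorch/CUDA usage, not a frankmocap-specific option) is to set `CUDA_VISIBLE_DEVICES` before CUDA is initialized, for example:

```python
# Sketch: restrict a PyTorch process to a single GPU (generic, not frankmocap-specific).
import os

# Make only GPU 0 visible to this process; must be set before CUDA is initialized.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch

print(torch.cuda.device_count())      # should print 1
print(torch.cuda.get_device_name(0))  # e.g. "GeForce RTX 2080 Ti"
```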

@penincillin Thanks for the quick reply. Even with one GPU it was giving the same error.

@coreqode
I haven't seen this problem before. There might be something wrong with the installation of Detectron2 or the hand detector, such as a mismatched CUDA version. You can try running the example code of Detectron2 or the hand detector in your current environment and see whether the same problem occurs.
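
A quick way to sanity-check the environment before running the Detectron2 examples (a generic PyTorch/Detectron2 check, not part of frankmocap) is to compare the CUDA version PyTorch was built with against the installed toolkit and confirm Detectron2 imports cleanly:

```python
# Minimal CUDA / Detectron2 environment check (generic, not frankmocap-specific).
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("PyTorch built with CUDA:", torch.version.cuda)  # should match the installed CUDA toolkit

try:
    import detectron2
    print("Detectron2:", detectron2.__version__)
except ImportError as exc:
    print("Detectron2 import failed:", exc)
```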

Yeah, it was an issue with the CUDA version.

@coreqode
Sounds good. Have you solved the problem?