facebookresearch/chameleon

Does the 30B model from Hugging Face support single-GPU inference?

neoyxm opened this issue · 1 comment

Hi guys,

I have one A100 80G so far, and when loading the model, the world_size is 0. I'm not sure whether the 30B model can run on a single GPU.

If a single GPU is not supported, and one day I get enough cards, can I enable multi-GPU inference by running the following command?
CUDA_VISIBLE_DEVICES='0,1,2,3' python -m chameleon.miniviewer --model-size 30b
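As a side note, a minimal sketch of what `CUDA_VISIBLE_DEVICES` does: it restricts which physical GPUs the launched process can see, so a framework like PyTorch would report a device count matching the listed IDs. The snippet below only inspects the environment variable; the actual GPU enumeration is done by the CUDA runtime at process start.

```python
import os

# Setting CUDA_VISIBLE_DEVICES before process launch exposes only the
# listed GPUs; e.g. torch.cuda.device_count() would then report 4.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1,2,3"

visible_ids = os.environ["CUDA_VISIBLE_DEVICES"].split(",")
print(len(visible_ids))  # number of GPUs the process will see
```

Note that the variable must be set before the CUDA context is initialized (i.e., before the first GPU call), which is why it is typically passed on the command line as shown above.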

Yes, this works.

Closing this issue.