Can I run inference with Valley-13b on V100 GPUs?
TonyXuQAQ opened this issue · 3 comments
TonyXuQAQ commented
As you mentioned before, Valley-13b can be trained on 16 V100 GPUs with DeepSpeed ZeRO-3. I wonder whether I can run inference with Valley-13b on V100 GPUs. Does Valley support multi-GPU inference?
RupertLuo commented
Yes, you need to set the model's device map to 'auto', like this:
model = ValleyLlamaForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map='auto')
TonyXuQAQ commented
Thanks for the prompt reply!
fightingaaa commented
Yes, you need to set the model's device map to 'auto', like this:
model = ValleyLlamaForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map='auto')
Hello, after setting the model's device map to 'auto', I get the following error. How can I solve it? Thanks in advance:
Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0!
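This error usually means the input tensors were left on a different GPU than the model's first layer: with device_map='auto', the layers are sharded across GPUs, so the inputs must be moved to the device holding the embedding layer before calling generate. A minimal sketch, assuming a Hugging Face-style tokenizer and the same model_name as above (the tokenizer class and prompt are illustrative assumptions, not from the Valley repo):

```python
import torch
from transformers import AutoTokenizer

# Shard the model across all visible GPUs, as suggested above.
model = ValleyLlamaForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "Describe this video."
# Key step: move the inputs to the device of the model's first parameters
# (model.device), instead of a hard-coded 'cuda:0'.
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

With the accelerate hooks that device_map='auto' installs, intermediate activations are moved between GPUs automatically; only the initial inputs need to be placed explicitly.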