allenai/scibert

How to finetune Scibert with multiple GPUs?

chloefresh opened this issue · 1 comment

I'd like to finetune SciBERT on multiple GPUs, not just one. But `export CUDA_DEVICE=0` in train_allennlp_local.sh means only GPU 0 is used. What do I need to modify to utilize multiple GPUs?

Multi-GPU finetuning is already implemented in the code via torch's `DataParallel`. You could change `export CUDA_DEVICE=0` to `export CUDA_DEVICE=0,1,2,3`, but you will also need to change how the `cuda_device` argument is parsed in the config files, since it then has to be passed as a list of device IDs rather than a single integer.
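As a rough sketch of what that parsing change could look like (the helper name and the exact config plumbing here are hypothetical, not code from the scibert repo), a comma-separated `CUDA_DEVICE` value can be turned into the single-int or list form that AllenNLP's trainer expects:

```python
# Hypothetical sketch: convert a CUDA_DEVICE environment variable like
# "0" or "0,1,2,3" into the value the trainer config needs.
# A single GPU stays a plain int; multiple GPUs become a list of ints,
# which is what triggers DataParallel-style training in older AllenNLP.
import os


def parse_cuda_device(env_value: str):
    """Return an int for one GPU, or a list of ints for several."""
    ids = [int(d) for d in env_value.split(",")]
    return ids[0] if len(ids) == 1 else ids


if __name__ == "__main__":
    # Falls back to GPU 0 if CUDA_DEVICE is unset.
    print(parse_cuda_device(os.environ.get("CUDA_DEVICE", "0")))
```

With this, `parse_cuda_device("0")` yields `0` and `parse_cuda_device("0,1,2,3")` yields `[0, 1, 2, 3]`; the second form is what you would substitute into the config's `cuda_device` field.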