TencentAILabHealthcare/scBERT

About pretraining process

QiaolinLu opened this issue · 5 comments

Hi, I am trying to reproduce the pretraining process and would like to know what resources it requires: the number of GPUs and the total pretraining time. Thanks

Hi, the pre-training process uses about 32 GPUs and runs for 1-2 weeks in a distributed manner.

Hi, which kind of GPU did you use, and how much memory does it have?

You can use V100.
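For reference, the figures above (32 GPUs for 1-2 weeks) work out to roughly the following compute budget. This is a back-of-the-envelope sketch based only on the numbers in this thread, not on anything stated in the repo:

```python
# Rough GPU-hour estimate from the figures in this thread
# (32 GPUs, 1-2 weeks); the duration bounds are assumptions.
gpus = 32
days_low, days_high = 7, 14  # "1-2 weeks"

gpu_hours_low = gpus * days_low * 24    # 5376 GPU-hours
gpu_hours_high = gpus * days_high * 24  # 10752 GPU-hours

print(f"Estimated budget: {gpu_hours_low}-{gpu_hours_high} V100 GPU-hours")
```

This kind of estimate can help when requesting cluster allocations or comparing against available cloud budgets.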

thanks

Hello, could you please send me a copy of the pre-training script?