About pretraining process
QiaolinLu opened this issue · 5 comments
QiaolinLu commented
Hi, I am trying to reproduce the pretraining process, and I want to know the pretraining resources, about the number of GPUs and pretraining time. Thanks
TencentAILabHealthcare commented
Hi, the pre-training process takes about 32 GPUs for 1-2 weeks in a distributed manner.
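For anyone budgeting compute, the figures above translate into a rough GPU-hour range. A minimal sketch of the arithmetic, assuming "1-2 weeks" means 7 to 14 full days of wall-clock time:

```python
# Rough GPU-hour estimate from the figures quoted above:
# 32 GPUs for 1-2 weeks (assumed here to mean 7-14 full days).
NUM_GPUS = 32
HOURS_PER_DAY = 24

low = NUM_GPUS * 7 * HOURS_PER_DAY    # 1 week of wall-clock time
high = NUM_GPUS * 14 * HOURS_PER_DAY  # 2 weeks of wall-clock time
print(f"{low}-{high} GPU-hours")      # 5376-10752 GPU-hours
```

Actual cost depends on the GPU model and interconnect, so treat this as an order-of-magnitude estimate only.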
QiaolinLu commented
Hi, which kind of GPU, and how much memory does it have?
TencentAILabHealthcare commented
You can use V100.
QiaolinLu commented
Thanks.
SELECT-FROM commented
> Hi, I am trying to reproduce the pretraining process, and I want to know the pretraining resources, about the number of GPUs and pretraining time. Thanks
Hello, could you please send me a copy of the pre-training script?