How long did it take to train the LauraTTS model?
Dinxin opened this issue · 5 comments
Dinxin commented
How long did it take to train the LauraTTS model?
ZhihaoDU commented
The model was trained for about 1.5 days on the LibriTTS clean subset with a single A800 GPU, and the batch size was 10,240 tokens.
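For readers unfamiliar with token-based batch sizes: a "batch size of 10,240 tokens" usually means utterances are grouped until their summed token count reaches a budget, rather than batching a fixed number of utterances. Below is a minimal, hypothetical Python sketch of that idea; the function name `batch_by_tokens` and the toy data are illustrative assumptions, not the actual FunCodec/LauraTTS data loader.

```python
# Sketch of token-count-based batching (assumed interpretation of
# "batch size = 10240 tokens"; not the exact FunCodec/LauraTTS code).
# Utterances are grouped so the summed token count per batch stays
# under a budget, instead of using a fixed number of utterances.

def batch_by_tokens(utterances, max_tokens=10240):
    """Group (utt_id, num_tokens) pairs into batches capped at max_tokens."""
    batches, current, current_tokens = [], [], 0
    # Sorting by length keeps padding overhead low within each batch.
    for utt_id, num_tokens in sorted(utterances, key=lambda x: x[1]):
        if current and current_tokens + num_tokens > max_tokens:
            batches.append(current)
            current, current_tokens = [], 0
        current.append(utt_id)
        current_tokens += num_tokens
    if current:
        batches.append(current)
    return batches

if __name__ == "__main__":
    # Toy example: the numbers stand in for per-utterance token lengths.
    utts = [("utt_a", 3000), ("utt_b", 4500), ("utt_c", 2500), ("utt_d", 6000)]
    for i, batch in enumerate(batch_by_tokens(utts, max_tokens=10240)):
        print(f"batch {i}: {batch}")
```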
Dinxin commented
On 8 A100 GPUs? Is the total duration of the data 6,000 hours?
ZhihaoDU commented
Only one A800 GPU. I think the duration of the LibriTTS clean subset is about 244 hours.