soobinseo/Transformer-TTS

limit to small batch size in training


Training OOMs when the batch size is larger than 4, which makes training very slow: GPU utilization is only about 20%. Each GPU has 11 GB of memory, and there are 8 GPUs.
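A common workaround when memory caps the batch size is gradient accumulation: run several small micro-batches, accumulate their gradients, and step the optimizer once, so the effective batch size grows without extra memory. Below is a minimal framework-free sketch (plain Python, hand-computed gradients for a toy linear model; all sizes and data are illustrative, not from this repo) showing that averaging per-micro-batch mean gradients matches the full-batch mean gradient. In PyTorch the same idea is just calling `loss.backward()` per micro-batch and `optimizer.step()` every N micro-batches.

```python
# Toy check: gradient accumulation over equal-sized micro-batches
# reproduces the full-batch gradient. Model: scalar w, loss =
# mean((w*x - y)^2). Numbers below are illustrative only.

def mse_grad(w, batch):
    """Gradient of mean((w*x - y)^2) with respect to w over a batch."""
    n = len(batch)
    return sum(2.0 * (w * x - y) * x for x, y in batch) / n

w = 0.5
data = [(float(i), 2.0 * i) for i in range(32)]  # targets follow y = 2x

# Full batch of 32 (the case that would OOM on the real model).
full_grad = mse_grad(w, data)

# Four micro-batches of 8: compute each mean gradient, then average,
# which is what stepping the optimizer once per 4 backward passes does.
micro_batches = [data[i:i + 8] for i in range(0, 32, 8)]
acc_grad = sum(mse_grad(w, mb) for mb in micro_batches) / len(micro_batches)

print(abs(full_grad - acc_grad) < 1e-9)  # the two gradients match
```

This does not fix the low GPU utilization by itself (the forward/backward still runs on micro-batches of 4), but it restores a larger effective batch size for optimization stability while staying under the 11 GB per-GPU limit.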