fatchord/WaveRNN

Pre-trained models.

anjanakethineni opened this issue · 2 comments

I am trying to get pre-trained models for other datasets. Can somebody tell me the exact steps to train these models myself?

Hi there, the steps to run training are in the main readme file. I'm not sure what the problem is here?

@fatchord Hi there, could you kindly share the hparams.py used to produce the pretrained models ljspeech.tacotron.r2.180k.zip and ljspeech.wavernn.mol.800k.zip? Also, any suggestions on using a bigger batch size to speed up training? Thanks.
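For reference, the settings in question live in hparams.py. The sketch below only illustrates the relevant fields with placeholder values, assuming the repo's (r, lr, step, batch_size) tuple convention for tts_schedule; it is not the configuration actually used to train the released checkpoints.

```python
# Illustrative excerpt of hparams.py — placeholder values, NOT the confirmed
# settings behind ljspeech.tacotron.r2.180k.zip or ljspeech.wavernn.mol.800k.zip.

# Tacotron training schedule: each tuple is assumed to mean
# (reduction factor r, learning rate, train until this step, batch size).
# The "r2" in the Tacotron checkpoint name suggests a final reduction factor of 2.
tts_schedule = [(7, 1e-3,  10_000, 32),
                (5, 1e-4, 100_000, 32),
                (2, 1e-4, 180_000, 16)]

# WaveRNN vocoder: the "mol" in the checkpoint name corresponds to the
# mixture-of-logistics output mode.
voc_mode = 'MOL'
voc_batch_size = 32   # raise this to speed up training if GPU memory allows;
voc_lr = 1e-4         # the learning rate may need rescaling with the batch size
```

Raising the batch size (the last element of each tts_schedule tuple, and voc_batch_size for the vocoder) is the usual knob for trading GPU memory against wall-clock training speed.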

Tacotron:
The default configuration in hparams.py results in very blurry attention plots after 350k steps.

Default config @ 350k steps:
[attached image: 1_griffinlim_351k]

Compared to the pretrained model @ 180k steps:
[attached image: 1_griffinlim_180k]
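A quick way to judge alignment sharpness when comparing checkpoints is to plot the attention matrix directly. A minimal sketch, assuming the alignment is available as a NumPy array of shape (decoder_steps, encoder_steps); the file names in the example are hypothetical:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_attention(attn: np.ndarray, path: str) -> None:
    """Save a heatmap of a Tacotron attention matrix.

    attn: array of shape (decoder_steps, encoder_steps). A sharp diagonal
    indicates good alignment; a smeared band indicates blurry attention.
    """
    fig, ax = plt.subplots(figsize=(10, 4))
    im = ax.imshow(attn.T, aspect='auto', origin='lower', interpolation='none')
    ax.set_xlabel('Decoder step')
    ax.set_ylabel('Encoder step')
    fig.colorbar(im, ax=ax)
    fig.savefig(path, bbox_inches='tight')
    plt.close(fig)

# Usage (hypothetical file names):
# plot_attention(np.load('attention_350k.npy'), 'attn_350k.png')
```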