soobinseo/Transformer-TTS

Problem with long text

Opened this issue · 1 comment

ttslr commented

Thanks for your great work!
I ran your code and it works well.

By the way, I use a dynamic `max_len` instead of the fixed 400 when I synthesize speech.
However, errors occur when a long text is given, because the position embedding's max length is 1024 (see https://github.com/soobinseo/Transformer-TTS/blob/master/network.py#17 and https://github.com/soobinseo/Transformer-TTS/blob/master/network.py#63).
I think it would be better to increase that number so the model works when fed long text.
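For anyone hitting the same error, here is a minimal sketch of how a sinusoidal position table with a larger cap could be rebuilt. The helper name, the cap of 2048, and the model dimension of 256 are illustrative assumptions, not taken from this repository:

```python
import numpy as np
import torch

def sinusoid_table(max_len, d_model):
    """Standard sinusoidal position-encoding table (Vaswani et al., 2017)."""
    pos = np.arange(max_len)[:, None]    # positions, shape (max_len, 1)
    i = np.arange(d_model)[None, :]      # dimensions, shape (1, d_model)
    angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
    table = np.zeros((max_len, d_model))
    table[:, 0::2] = np.sin(angle[:, 0::2])  # even dims: sine
    table[:, 1::2] = np.cos(angle[:, 1::2])  # odd dims: cosine
    return torch.FloatTensor(table)

# Rebuild the lookup with a larger cap (2048 and 256 are placeholder values)
# so long inputs no longer index past the end of the table.
pos_emb = torch.nn.Embedding.from_pretrained(sinusoid_table(2048, 256), freeze=True)
```

Since the table is only computed once at construction time, raising the cap (or deriving it from the longest expected input) adds negligible memory and no runtime cost.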

Thanks for your work again. :)

Thanks for your advice.

I will check it soon.