Textless NLP: missing tacotron2 training script for GSLM
Opened this issue · 8 comments
🐛 Bug
There is no script provided to train the Tacotron2 model in the following path: fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2
Could you please provide the training script as well, so that we can train our own decoder (modified Tacotron2) on a new dataset?
Thanks!
@hikushalhere Can you please help us here?
@eugene-kharitonov could you please help with this? Thank you!
Can someone please help with this? Thanks
As far as I remember, the training code for modified Tacotron2 was not open sourced with GSLM. @wnhsu, can you help?
Thank you @hikushalhere for the update. Looking forward to @wnhsu's input on this.
@ahazeemi you can refer to the fairseq unit-to-speech example here, or to NVIDIA's Tacotron2 repo (link).
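Since the official training code was not released, one plausible route is to treat the discrete units as the "transcript" side of a standard Tacotron2 filelist. The sketch below is an assumption about the data preparation, not the actual GSLM pipeline: the function names (`units_to_text`, `make_filelist_line`) and the `<audio_path>|<transcript>` line format (borrowed from NVIDIA's Tacotron2 repo) are hypothetical choices for illustration.

```python
# Hypothetical sketch: formatting quantized speech units as "text" for a
# Tacotron2-style trainer. This is NOT the official GSLM training code
# (which was not open-sourced); names and formats are assumptions.

def units_to_text(units):
    """Render a sequence of integer unit IDs as a space-separated string,
    similar to the surface form GSLM's unit2speech inference consumes."""
    return " ".join(str(u) for u in units)

def make_filelist_line(audio_path, units):
    """Emit one line in the NVIDIA-Tacotron2 filelist style:
    <audio_path>|<transcript>, with unit IDs standing in for characters."""
    return f"{audio_path}|{units_to_text(units)}"

if __name__ == "__main__":
    utterance = [71, 12, 12, 57, 3]  # toy quantized-unit sequence
    print(make_filelist_line("wavs/utt0001.wav", utterance))
```

With filelists in this shape, a Tacotron2 trainer's text frontend would only need its symbol set swapped from characters to unit IDs; the acoustic side is unchanged.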
Thank you for the reply. After training the Transformer/FastSpeech2 model on units (as described here), can it be used in place of the modified Tacotron2 in GSLM speech resynthesis?
I don't think you can use it as a drop-in replacement, since it's a separate reimplementation.