syang1993/gst-tacotron

multi-head attention

Young-Sun opened this issue · 2 comments

Hi, again.
The code is currently set to use mlp_attention.
Did the uploaded audio demo samples use mlp_attention?
Have you ever experimented with multi-head attention? If so, do you have any audio samples?

Thanks.

Hi, mlp_attention is part of the multi-head attention module: dot_attention and mlp_attention are two different ways to compute the attention weights within multi-head attention. The demo samples were trained with the default settings (i.e., mlp_attention).
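
To make the distinction concrete, here is a minimal NumPy sketch (not this repository's actual TensorFlow implementation) of multi-head attention where the score function can be swapped between a scaled dot-product score and an MLP/additive score. The learned MLP projection vector is replaced by a fixed stand-in, and all names are illustrative.

```python
import numpy as np

def split_heads(x, num_heads):
    # Split (seq_len, d_model) into (num_heads, seq_len, depth).
    seq_len, d_model = x.shape
    depth = d_model // num_heads
    return x.reshape(seq_len, num_heads, depth).transpose(1, 0, 2)

def dot_score(q, k):
    # Scaled dot-product score: (heads, len_q, len_k).
    depth = q.shape[-1]
    return np.matmul(q, k.transpose(0, 2, 1)) / np.sqrt(depth)

def mlp_score(q, k, v_weight):
    # Additive (MLP) score: tanh of the broadcast sum of query and key,
    # projected to one scalar per (query, key) pair.
    hidden = np.tanh(q[:, :, None, :] + k[:, None, :, :])  # (heads, len_q, len_k, depth)
    return np.tensordot(hidden, v_weight, axes=([-1], [0]))  # (heads, len_q, len_k)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, keys, num_heads=4, score_fn="mlp"):
    q = split_heads(query, num_heads)
    k = split_heads(keys, num_heads)
    v = split_heads(keys, num_heads)  # values reuse the keys in this sketch
    if score_fn == "dot":
        scores = dot_score(q, k)
    else:
        depth = q.shape[-1]
        v_weight = np.ones(depth) / depth  # stand-in for a learned vector
        scores = mlp_score(q, k, v_weight)
    weights = softmax(scores, axis=-1)   # attention weights per head
    context = np.matmul(weights, v)      # (heads, len_q, depth)
    heads, len_q, depth = context.shape
    return context.transpose(1, 0, 2).reshape(len_q, heads * depth)

# Illustrative usage, e.g. a reference-encoder query attending over style tokens:
q = np.random.randn(5, 128)
k = np.random.randn(10, 128)
print(multi_head_attention(q, k, num_heads=4, score_fn="mlp").shape)  # (5, 128)
```

Either score function produces a (len_q, len_k) weight matrix per head; only the way the query and key are compared differs.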

Sorry for the misunderstanding. Yes, both dot_attention and mlp_attention are part of multi-head attention.
I see, so the audio samples were synthesized using mlp_attention.
Thanks for your fast and kind reply. :-)