CMPT 419 (Machine Learning) class Final Report
Fall 2018 term; arXiv submission in progress.
Abstract
A modified RNN-based seq2seq architecture (i.e. Tacotron), aimed at reducing model size and speeding up training. The following is implemented:
- Self-attention (similar to the Transformer paper)
- Guided attention & forced incremental attention (from the DCTTS paper)
- Grapheme & phoneme mixed input
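Of the items above, the guided attention term is the most self-contained: DCTTS penalizes attention weights that fall far from the input/output diagonal. A minimal NumPy sketch of that loss follows; the function names and the width parameter `g` are illustrative, not taken from this report's code.

```python
import numpy as np

def guided_attention_matrix(N, T, g=0.2):
    # Penalty matrix W[n, t] in the DCTTS style: entries near the
    # diagonal (n/N close to t/T) cost ~0, far-off entries cost ~1.
    n = np.arange(N).reshape(-1, 1) / N
    t = np.arange(T).reshape(1, -1) / T
    return 1.0 - np.exp(-((n - t) ** 2) / (2 * g ** 2))

def guided_attention_loss(A, g=0.2):
    # A: attention matrix of shape (N, T); the loss is the mean
    # elementwise product of A with the penalty matrix, so diagonal
    # (monotonic) alignments are cheap and scattered ones are not.
    N, T = A.shape
    return float(np.mean(A * guided_attention_matrix(N, T, g)))
```

In training this term would be added to the main reconstruction loss; a perfectly diagonal alignment incurs (near) zero penalty, while an anti-diagonal one is penalized heavily.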
Audio Samples
Listen to audio samples here.