/seq2seq-with-attentation-batch-pytorch

A baseline model (seq2seq with an attention mechanism) for natural language generation.
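As a rough illustration of what such a model looks like, below is a minimal sketch of a batched seq2seq encoder and an attention decoder in PyTorch. It uses GRU layers and Luong-style dot-product attention; all names, sizes, and the attention variant are assumptions for illustration, not taken from the repository's notebook.

```python
# Minimal sketch: batched seq2seq with dot-product attention in PyTorch.
# Hyperparameters, vocab sizes, and the attention variant are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src):                          # src: (batch, src_len)
        embedded = self.embedding(src)                # (batch, src_len, hidden)
        outputs, hidden = self.gru(embedded)          # outputs: (batch, src_len, hidden)
        return outputs, hidden                        # hidden: (1, batch, hidden)

class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size * 2, vocab_size)

    def forward(self, tgt_step, hidden, encoder_outputs):
        # tgt_step: (batch, 1) current target token ids
        embedded = self.embedding(tgt_step)                           # (batch, 1, hidden)
        output, hidden = self.gru(embedded, hidden)                   # (batch, 1, hidden)
        # Dot-product attention scores over encoder time steps
        scores = torch.bmm(output, encoder_outputs.transpose(1, 2))  # (batch, 1, src_len)
        attn_weights = F.softmax(scores, dim=-1)
        context = torch.bmm(attn_weights, encoder_outputs)           # (batch, 1, hidden)
        logits = self.out(torch.cat([output, context], dim=-1))      # (batch, 1, vocab)
        return logits, hidden, attn_weights

# Usage: one teacher-forced decoding step on a toy batch
if __name__ == "__main__":
    src = torch.randint(0, 100, (4, 10))      # 4 source sequences of length 10
    tgt_step = torch.randint(0, 120, (4, 1))  # current target tokens
    enc, dec = Encoder(100, 32), AttnDecoder(120, 32)
    enc_out, hidden = enc(src)
    logits, hidden, attn = dec(tgt_step, hidden, enc_out)
    print(logits.shape, attn.shape)           # (4, 1, 120) and (4, 1, 10)
```

At generation time, the same decoder step would be run in a loop, feeding back the argmax (or sampled) token instead of the ground-truth target.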

Primary language: Jupyter Notebook
