Read several key NLP papers and implemented and trained the models with reference to bentrevett/pytorch-seq2seq.
- Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to sequence learning with neural networks. Advances in neural information processing systems, 27.
  - Sequence-to-sequence learning with an LSTM encoder-decoder
- Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
  - Sequence-to-sequence learning with a GRU encoder-decoder
- Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473.
  - Sequence-to-sequence learning with an attention mechanism
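The encoder-decoder pattern these papers share can be sketched as below. This is a minimal, hypothetical PyTorch example (not the actual code from bentrevett/pytorch-seq2seq): the encoder LSTM compresses the source sequence into its final (hidden, cell) state, which initializes the decoder, as in Sutskever et al. (2014).

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids; keep only the final states
        _, (hidden, cell) = self.rnn(self.embedding(src))
        return hidden, cell

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.fc_out = nn.Linear(hid_dim, vocab_size)

    def forward(self, trg, hidden, cell):
        # trg: (batch, trg_len); decoding starts from the encoder's state
        output, _ = self.rnn(self.embedding(trg), (hidden, cell))
        return self.fc_out(output)  # (batch, trg_len, vocab_size)

# toy batch: 2 source sentences of 7 tokens, targets of 5 tokens
src = torch.randint(0, 100, (2, 7))
trg = torch.randint(0, 100, (2, 5))
enc, dec = Encoder(100, 32, 64), Decoder(100, 32, 64)
hidden, cell = enc(src)
logits = dec(trg, hidden, cell)
print(logits.shape)  # torch.Size([2, 5, 100])
```

Swapping `nn.LSTM` for `nn.GRU` (which carries only a hidden state, no cell state) gives the Cho et al. (2014) variant; the Bahdanau et al. (2014) model additionally lets the decoder attend over all encoder outputs rather than only the final state.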