Neural-Machine-Translation

Aims to study and implement LSTM Seq2Seq models for Machine Translation.


The documentation of this project will be updated here shortly.

Project Timeline

Week 1: Literature Review

  • Read papers, articles, and tutorials on implementing LSTM Seq2Seq models for neural machine translation.

Week 2: Dataset and Model implementation

  • Find a dataset large and varied enough to train the model on.
  • Implement the model and train it on the dataset.
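The implementation step of Week 2 could start from something like the following minimal Keras sketch of an encoder-decoder LSTM trained with teacher forcing. The vocabulary sizes, latent dimension, and the random batch are placeholder values, not taken from this project:

```python
import numpy as np
from tensorflow.keras.layers import Input, LSTM, Dense, Embedding
from tensorflow.keras.models import Model

SRC_VOCAB, TGT_VOCAB, LATENT = 1000, 1000, 256  # hypothetical sizes

# Encoder: embed source tokens, keep only the final LSTM states.
enc_inputs = Input(shape=(None,))
enc_emb = Embedding(SRC_VOCAB, LATENT)(enc_inputs)
_, state_h, state_c = LSTM(LATENT, return_state=True)(enc_emb)

# Decoder: initialized with the encoder states, predicts the next
# target token at each step.
dec_inputs = Input(shape=(None,))
dec_emb = Embedding(TGT_VOCAB, LATENT)(dec_inputs)
dec_out, _, _ = LSTM(LATENT, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
dec_probs = Dense(TGT_VOCAB, activation="softmax")(dec_out)

model = Model([enc_inputs, dec_inputs], dec_probs)
model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy")

# Tiny random batch just to check shapes; with teacher forcing the
# decoder input is the target sequence shifted right by one token.
src = np.random.randint(0, SRC_VOCAB, (4, 7))
tgt_in = np.random.randint(0, TGT_VOCAB, (4, 5))
preds = model.predict([src, tgt_in], verbose=0)
print(preds.shape)  # (4, 5, 1000)
```

At inference time the trained layers are usually rewrapped into separate encoder and decoder models so the decoder can be run one token at a time.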

Week 3: Testing

  • Test the model and iterate on it to maximize translation quality.
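For the testing week, translation quality is commonly measured with BLEU. A toy sentence-level implementation (pure Python, for quick sanity checks only; full evaluations typically use a library such as sacreBLEU) might look like this:

```python
import math
from collections import Counter

def bleu(candidate, reference, max_n=4):
    """Toy sentence-level BLEU with the standard brevity penalty.

    candidate, reference: lists of tokens.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(tuple(candidate[i:i + n])
                       for i in range(len(candidate) - n + 1))
        ref = Counter(tuple(reference[i:i + n])
                      for i in range(len(reference) - n + 1))
        overlap = sum((cand & ref).values())   # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        precisions.append(overlap / total if overlap else 1e-9)  # smoothing
    # Brevity penalty discourages overly short translations.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

cand = "the cat sat on the mat".split()
ref = "the cat sat on the mat".split()
print(round(bleu(cand, ref), 3))  # 1.0 for a perfect match
```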