
Transformer network

Implementing a transformer network from scratch

Implementation based on the paper Attention Is All You Need: https://arxiv.org/pdf/1706.03762.pdf
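
The core operation defined in that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The snippet below is a minimal, self-contained illustration of that formula using NumPy; it is a sketch for explanation only and is not taken from this repository's code (the function name is hypothetical).

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (illustrative sketch)."""
    d_k = q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)        # (batch, seq_q, seq_k)
    if mask is not None:
        # Suppress masked positions before the softmax
        scores = np.where(mask, scores, -1e9)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v                                       # (batch, seq_q, d_v)

# Example: a batch of 2 sequences, 5 tokens each, model dimension 8
q = k = v = np.random.randn(2, 5, 8)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 5, 8)
```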

Training the model

First, add the following line to your shell config file (.zshrc or .bashrc): export PYTHONPATH="${PYTHONPATH}:~/Path/to/Transformer"

To train the model, run actuation/train.py from the root of the Transformer directory.
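
For example, a minimal invocation might look like the following (assuming no extra command-line arguments are required; check the script itself for any configurable options):

python actuation/train.py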