Attention is all you need!

"A transformer is a deep learning model that adopts the mechanism of self-attention, differentially weighting the significance of each part of the input data." ~wiki.

"What I cannot create, I do not understand." ~ Richard Feynman (1918-1988)

Table of contents

  1. Tutorials
  2. References

Clone repository

The GitHub repository is available at https://github.com/mxochicale/transformers-tutorials

To clone this repo, you might need to generate your SSH keys as suggested here. You can then clone the repository by typing (or copying) the following lines in a terminal at your chosen path on your machine:

cd && mkdir -p repositories/mxochicale && cd repositories/mxochicale
git clone git@github.com:mxochicale/transformers-tutorials.git

Issues

If you have questions or experience any problems, please open an issue.