This package provides helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth that show how to construct it.
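As a rough illustration of what such vignettes build up to, a single-head scaled dot-product self-attention step can be sketched in base R. This is an illustrative sketch only, not the package's actual API; the softmax helper and the identity Q/K/V projections are assumptions made for brevity.

```r
# Row-wise softmax helper (assumed, not part of the package)
softmax <- function(x) {
  e <- exp(x - max(x))  # subtract max for numerical stability
  e / sum(e)
}

set.seed(42)

# Toy input: 3 tokens, each a 4-dimensional embedding
X <- matrix(rnorm(12), nrow = 3, ncol = 4)

d_k <- ncol(X)
# For simplicity, use identity projections so Q = K = V = X
Q <- X; K <- X; V <- X

# Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
scores  <- Q %*% t(K) / sqrt(d_k)
weights <- t(apply(scores, 1, softmax))  # apply softmax to each row
output  <- weights %*% V                 # weighted sum of values
```

Each row of `weights` sums to 1, and `output` has the same shape as `V`: every token's output is a convex combination of all tokens' values.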
The package can be installed from CRAN using:
install.packages('attention')
The development version, to be used at your own peril, can be installed from GitHub using the remotes package:
if (!require('remotes')) install.packages('remotes')
remotes::install_github('bquast/attention')
Development takes place on the GitHub page: https://github.com/bquast/attention
Bugs can be filed on the issues page on GitHub.