A modified transformer network that applies the attention mechanism to time series and other numerical data. A 6.100 Electrical Engineering and Computer Science project at the MIT Media Lab.
- "Attention Is All You Need", the original transformer paper from Google - https://arxiv.org/pdf/1706.03762.pdf
- The Annotated Transformer from Harvard NLP - http://nlp.seas.harvard.edu/2018/04/03/attention.html
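For orientation, the core operation the referenced paper introduces is scaled dot-product attention. The sketch below is a minimal NumPy illustration of that formula applied to a toy numerical sequence, not the project's actual implementation (the repo uses PyTorch); the function name and shapes are chosen here for clarity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    as defined in "Attention Is All You Need".
    Q, K: (seq_len, d_k); V: (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise similarities
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                              # weighted sum of values

# Toy "time series": 4 time steps, model dimension 3; self-attention uses
# the same sequence as queries, keys, and values.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 3)
```

In the full transformer this operation is wrapped in learned linear projections and repeated across multiple heads, but the arithmetic per head is exactly the above.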
- Python 3.6+
- PyTorch stable (1.1)
- SciPy
- PyLDS
- final_report.pdf - the final write-up of the project
- code_explanation.pdf - a piece-by-piece explanation of all the functions
- EEG_train.ipynb - a training example on the EEG (electroencephalogram) dataset
- LDS_train.ipynb - a training example on the GLDS (Gaussian linear dynamical systems) dataset
- Yingqi Ding (@dyq0811) - me
- Ruyue Hong (@redevaaa) - co-author
- Neo Mohsenvand (@NeoVand) - idea and guidance
- Mehul Smriti Raje (@mraje16) - EEG preprocessing