
EEG-Transformer-seq2seq

A modified transformer network that applies the attention mechanism to time series and other numerical data. 6.100 Electrical Engineering and Computer Science project at the MIT Media Lab.
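The core modification can be illustrated with a minimal PyTorch sketch. This is an illustrative outline, not the exact architecture in this repository: a standard transformer consumes token embeddings, so for numerical sequences the embedding lookup is replaced by a learned linear projection of the input features. All names and dimensions below (NumericTransformer, n_features, d_model) are hypothetical.

```python
import torch
import torch.nn as nn

class NumericTransformer(nn.Module):
    """Sketch: transformer encoder for numerical sequences (illustrative)."""
    def __init__(self, n_features, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Project raw numerical features into the model dimension,
        # taking the place of a token-embedding lookup.
        self.input_proj = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Map encoded states back to the feature space for seq2seq prediction.
        self.output_proj = nn.Linear(d_model, n_features)

    def forward(self, x):
        # x: (seq_len, batch, n_features)
        h = self.encoder(self.input_proj(x))
        return self.output_proj(h)

model = NumericTransformer(n_features=8)
out = model(torch.randn(100, 16, 8))  # 100 time steps, batch of 16
```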

The original NLP paper and Transformer model

The original paper, "Attention Is All You Need" (Google) - https://arxiv.org/pdf/1706.03762.pdf

The Annotated Transformer by Harvard NLP - http://nlp.seas.harvard.edu/2018/04/03/attention.html
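For reference, the scaled dot-product attention at the heart of these resources, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, can be sketched in a few lines of PyTorch (a generic sketch, not code from this repository):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (..., seq_len, d_k)
    d_k = q.size(-1)
    # Similarity scores, scaled by sqrt(d_k) to stabilize gradients.
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        # Block masked positions before the softmax.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    # Each output is a weighted combination of the values.
    return weights @ v
```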

Getting started

To know about the project and see the performance

final_report.pdf - the complete write-up and presentation of the project

To understand the code

code_explanation.pdf - explains all the functions piece by piece

To train the model

EEG_train.ipynb - a training example on the EEG (electroencephalogram) dataset

LDS_train.ipynb - a training example on the GLDS (Gaussian linear dynamical systems) dataset; a sketch of such data follows below
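For readers unfamiliar with the GLDS setup, here is a hedged sketch of how such a dataset can be generated (the dimensions, noise scales, and function name are illustrative, not the notebook's actual settings): a latent state evolves linearly with Gaussian process noise, and the observed sequence is a noisy linear readout of that state.

```python
import numpy as np

def generate_glds(T=200, state_dim=2, obs_dim=4, seed=0):
    """Sketch: sample one trajectory from a Gaussian linear dynamical system."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(state_dim, state_dim))
    # Rescale so the spectral radius is below 1, keeping the dynamics stable.
    A /= np.max(np.abs(np.linalg.eigvals(A))) * 1.1
    C = rng.normal(size=(obs_dim, state_dim))
    x = rng.normal(size=state_dim)
    ys = []
    for _ in range(T):
        x = A @ x + 0.1 * rng.normal(size=state_dim)       # state transition
        ys.append(C @ x + 0.1 * rng.normal(size=obs_dim))  # noisy observation
    return np.stack(ys)  # (T, obs_dim) sequence to feed the model

data = generate_glds()
```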

Authors & Mentors

Yingqi Ding (@dyq0811) - me

Ruyue Hong (@redevaaa) - co-author

Neo Mohsenvand (@NeoVand) - idea and guidance

Mehul Smriti Raje (@mraje16) - EEG preprocessing