A simple brain decoder for kinematic information decoding analysis.
A Japanese version of this README is available here.
Decoding finger flexion movements from ECoG signals.
The BCI Competition IV dataset is used for this demo.
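To give a feel for what such a decoder does, here is a minimal, self-contained sketch: it fits a plain ridge-regression decoder on synthetic feature data standing in for ECoG band-power features, and scores it with the correlation between the decoded and true flexion traces. The shapes, the ridge model, and the synthetic data are illustrative assumptions; the actual feature extraction and model in this repository may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for band-power features from ECoG channels
# (illustrative shapes; the real demo uses BCI Competition IV Data Sets 4).
n_samples, n_features = 400, 32
X = rng.standard_normal((n_samples, n_features))
true_w = rng.standard_normal(n_features)
y = X @ true_w + 0.1 * rng.standard_normal(n_samples)  # finger-flexion trace

# Ridge-regression decoder, closed form: w = (X'X + lam*I)^-1 X'y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)
y_hat = X @ w

# Pearson correlation between decoded and true traces (a common metric
# for continuous kinematic decoding)
r = np.corrcoef(y, y_hat)[0, 1]
print(f"decoding correlation r = {r:.3f}")
```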
- GUI support
Brain-kinematics-decoder provides a simple GUI for analysis, so you can easily start a decoding analysis.
- Linux and Windows (MacOS X is not officially supported)
- Python 3.7
a. Create a conda virtual environment and activate it.
conda create -n BrainDecoder python=3.7 anaconda
conda activate BrainDecoder
b. Install MNE-Python following the official instructions, e.g.,
conda install -c conda-forge mne
c. Clone the Brain-kinematics-decoder repository.
git clone https://github.com/RyotaroNumata/Brain-kinematics-decoder.git
cd Brain-kinematics-decoder
d. Register for BCI Competition IV and download Data Sets 4.
After downloading BCI Competition IV Data Sets 4, create the following directories for storing the data:
Brain-kinematics-decoder
├── FileIO
├── Model
├── SignalProcessing
├── Utils
└── data
└── BCI4
        └── subject_ECoG_data
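The data directories from the tree above can be created in one step from the repository root (the path is taken from the tree; place the downloaded files inside `subject_ECoG_data`):

```shell
# Create the data directories expected by the scripts
mkdir -p data/BCI4/subject_ECoG_data
```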
If you want to run the analysis with GUI support, run this script in your terminal or command prompt:
python GUImain.py
You can also use Brain-kinematics-decoder from your IDE. In that case, run:
python Decodig_main.py
RyotaroNumata
Please feel free to contact me if you have any questions about this repository.