NeuroTechX/EEG-ExPy

EEG stream passed QC but unsure about validity of recorded signal

leotozzi88 opened this issue · 4 comments

ℹ Computer information

  • Platform OS (e.g. Windows, Mac, Linux etc): macOS 10.15.7
  • Python Version: Python 3.7.4
  • Brain Interface Used (e.g Muse, OpenBCI, Notion etc): Muse S

📝 Provide detailed reproduction steps (if any)

Dear Experts,

I am one of the participants in your NTCS Phase 1 experiment; I hope this is the right place to post my issue.

First of all, thank you for providing this great software and for the very clear instructions!

I have done 10 x 300 s runs of the N170 experiment as instructed, and I repeatedly used the signal check tool to make sure that my electrodes were well placed before acquiring the data. I checked before starting the experiment and every 2 runs, always passing the check.

After the experiment, I plotted my CSV files to make sure the data had been acquired correctly before sending it to you, and I found that in fact no signal seems to have been recorded. I am not an EEG expert (I am an MRI researcher), but this looks pretty bad. The attached screenshot is the plot of electrode TP9, but they all look like this.

[Screenshot: plot of electrode TP9]

I have even attempted a rough ERP analysis following these instructions and I get pure noise (all epochs rejected as artifacts, no signal in the data): https://www.krigolsonlab.com/muse-analysis-matlab.html

The thing I wanted to flag is that my headset passed the quality check that you implemented as part of your experiment, but no signal was recorded.

I just bought the Muse S, and so far I have been unable to detect any signal from it (I have tried even after your experiment). I would appreciate it if someone could tell me what I might be doing wrong, so that I can actually send you data for your study instead of noise.

Is there a way to actually plot the signal "live" using your toolbox? Maybe this could help me diagnose the problem.

Thank you very much,

Leonardo Tozzi

Hi Leo.

Thanks for the feedback.

The title of the issue says no signal was recorded, but if I understand correctly you have .csv files for every session, and your question is how to analyze them?

The code to do that can be found here:

https://neurotechx.github.io/eeg-notebooks/auto_examples/visual_n170/01r__n170_viz.html

You may need to tweak the reject={'eeg': 5e-5} number (the peak-to-peak amplitude threshold, in volts, above which an epoch is dropped). As a rule of thumb, you usually need to reject at least the worst ~5% of trials, because there are inevitably some muscle-contaminated epochs. The sweet spot between rejecting bad data and keeping a reasonable amount of it is usually a rejection rate somewhere between 10% and 40%, depending on the data and how much of it you have.
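To make the threshold concrete: this is not the eeg-notebooks code itself, just a minimal numpy sketch of what a peak-to-peak reject threshold like {'eeg': 5e-5} does, assuming epochs arranged as an (n_epochs, n_channels, n_samples) array in volts:

```python
import numpy as np

def reject_epochs(epochs, threshold=5e-5):
    """Drop epochs whose peak-to-peak amplitude on ANY channel exceeds threshold.

    epochs: array of shape (n_epochs, n_channels, n_samples), in volts.
    Returns the kept epochs and the resulting rejection rate.
    """
    ptp = epochs.max(axis=2) - epochs.min(axis=2)  # (n_epochs, n_channels)
    keep = (ptp <= threshold).all(axis=1)          # reject on any bad channel
    rate = 1.0 - keep.mean()
    return epochs[keep], rate

# toy data: 100 epochs, 4 channels, 256 samples of ~5 uV Gaussian "EEG",
# with 5 epochs contaminated by a large 200 uV artifact burst on channel 0
rng = np.random.default_rng(0)
epochs = rng.normal(0, 5e-6, size=(100, 4, 256))
epochs[:5, 0, 100:110] += 200e-6

clean, rate = reject_epochs(epochs, threshold=5e-5)
print(clean.shape, rate)  # the 5 contaminated epochs are dropped
```

Lowering the threshold drops more epochs; raising it keeps more but lets larger artifacts through, hence the 10-40% rule of thumb above.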

We are also aiming to add a report-generating CLI program soon that implements this without the need for Python coding. Watch this space for that.

Quick question: how did you do the Bluetooth streaming? Did you use BlueMuse or BrainFlow (i.e. did you specify the device as muse or muse_bfn)? The new preferred option is BrainFlow, as it is a bit more robust and a lot simpler than BlueMuse; but both are viable and have their advantages.

Regarding a real-time signal visualizer:

This isn't something we have implemented in eeg-notebooks yet, but there are options from the two main streaming libraries:

For muselsl (which I think is the right choice for you, as you are on a Mac), this is simple: first initiate a stream in a second terminal with muselsl stream, then start the viewer with

muselsl view -v 2

(note: you may first need to pip install vispy)

For BrainFlow, there is also a real-time visualizer:

https://brainflow.org/2021-07-05-real-time-example/

that does work, but needs some tweaking IMHO.

We will also incorporate this into eeg-notebooks at some point soon.

Hopefully, though, the analysis code linked above will let you get some ERPs from your recordings without needing to revisit the question of signal quality first (although it is always good to keep on top of this if you are able to put in the time).

Keep us posted how it goes, happy to clarify further.

j


Dear John,

Thank you for your thorough reply; I will make sure to check all those scripts!

I think I mislabelled my question, and I apologize for that: I do have data. What was puzzling me is that the data look heavily contaminated by high-frequency noise, so I thought the electrodes were not recording correctly. But maybe it's just because I am not used to EEG data: I have made more recordings, and I see that the first step in many preprocessing pipelines is a low-pass filter. So if the plot I sent above looks normal to you, I guess this is just how EEG data look!
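For reference, this is the kind of low-pass step I mean: a minimal sketch on toy data using scipy (256 Hz is the Muse's nominal sampling rate; the signal here is synthetic and all parameter choices are illustrative):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0  # Muse nominal sampling rate, Hz

# toy "raw EEG": a 10 Hz alpha-like rhythm buried in broadband noise
rng = np.random.default_rng(1)
t = np.arange(0, 4.0, 1.0 / fs)
raw = 20e-6 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 30e-6, t.size)

# 4th-order Butterworth low-pass at 30 Hz; filtfilt runs the filter
# forwards and backwards so it introduces no phase shift
b, a = butter(4, 30.0 / (fs / 2.0), btype="low")
filtered = filtfilt(b, a, raw)

# the high-frequency noise is attenuated, the 10 Hz rhythm survives
print(raw.std(), filtered.std())
```

After a step like this, the plots look much more like the textbook EEG traces I was expecting.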

Thank you also for your tips on visualizing the signal, I think this will help me a lot in understanding how to place the electrodes and how the signal should look!

Thank you very much,

Leonardo Tozzi

oreHGA commented

Thanks @leotozzi88, closing this issue. Please feel free to re-open if you have any more questions.