kowalej/BlueMuse

Contact Status Indicator

Closed this issue · 9 comments

I have used BlueMuse for data acquisition with the Muse 2 headband, but BlueMuse does not provide a "Contact Status Indicator". Has anyone successfully integrated this function into the BlueMuse source code?

Thanks!

I'm not sure what "Contact Status Indicator" means. Are you referring to how well the headband is seated on your head (i.e. how well the electrodes are contacting the skin)? I'm not aware of any way to get this data from the Muse, but I do know there is one Bluetooth characteristic that seems to spit out data while streaming from the Muse, so maybe this information is contained there.

Headband contact status (or the horseshoe indicator). It indicates how well the Muse headband contacts the user's head (1 for good contact, 2 for fair contact, 4 for bad contact). Previously we could get this data via the Muse SDK, but they no longer provide the SDK.

I'll take a look at that extra data channel I was talking about and see if the data matches what the horseshoe provides. However, I have a feeling this metric was generated in software by the SDK, and not on the actual headband.

I pulled some data from this channel and converted it to binary, but I couldn't really figure out what the data structure was. The samples are attached to this comment; I posted 3 sets (first set is wearing the headband properly, second set is purposely messing it up, third set is with the headband off my head).

Every other line has "TS: xxxx", which was extracted from the first 16 bits (as usual with Muse packets) and is just a packet counter. Below each "TS..." line you will see the raw binary (including the 16 bits of the timestamp).

So yeah, I'm not sure what this channel is measuring, but the values are fairly different between wearing the headband properly and having it off the head. These values could be voltage levels or something totally different.

unknown-channel-muse-values.txt

Thank you so much for your great help. I have tried to read this file and understand what these digits mean, but I still could not figure out what the numbers stand for. Definitely, though, it should not be the horseshoe indicator.

Maybe you are right. These indicators are generated through their algorithms. In the Muse SDK, the indicator frequency is 10 Hz.

If I find anything new and useful, I will let you know in this issue. Thanks again!

oori commented

You can mimic the horseshoe by computing the standard deviation of each channel.
See: alexandrebarachant/muse-lsl#98 (comment)

Here's an example using Python (data_epoch is a channels-by-samples epoch of EEG data; client is an OSC client such as python-osc):

import numpy as np

def horseshoe_value(std):
    # Map a channel's standard deviation to the SDK's horseshoe values
    GOOD = 30
    BAD = 75
    return 1.0 if std < GOOD else 2.0 if std < BAD else 4.0

quality = np.std(data_epoch, axis=0)  # std dev per channel over the epoch
horseshoe = list(map(horseshoe_value, quality))
client.send_message("/elements/horseshoe", horseshoe)

This works fine. The hard part is emulating touching_forehead. One could do it with something like touching_forehead = int(np.sum(quality) < 1000), smoothed over 1-2 seconds to discard momentary interference, but it doesn't work as well as the Muse SDK (RIP), which most likely used the reference sensor for that, so it reacted very well without the averaging delay. @kowalej - I wonder if those unresolved channel(s) carry the reference sensor (DRL on Fpz)? If so, decoding them could be a huge help!
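A minimal sketch of the smoothing idea above. The threshold of 1000 and the 1-2 second window are the values suggested in this comment; the 10 Hz update rate and all names here are illustrative assumptions, not part of any Muse API:

```python
from collections import deque

import numpy as np

WINDOW = 20  # roughly 2 s of quality updates at an assumed 10 Hz
recent = deque(maxlen=WINDOW)

def touching_forehead(quality):
    # quality: array of per-channel standard deviations for the latest epoch
    recent.append(int(np.sum(quality) < 1000))
    # Majority vote over the window discards momentary interference
    return int(np.mean(recent) > 0.5)
```

The majority vote trades responsiveness for stability; a shorter window reacts faster but flickers more.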

Thanks.

oori commented

I looked at the data you pulled (unknown-channel-muse-values.txt), here's what I see:
(converted to HEX for readability)

  1. Every line contains a 16-bit timestamp + six 24-bit values.
  2. When the headset is firmly on (first samples), the values are nearly fixed on all channels: mostly 0x7e37c7, with small changes to 0x7e27c7 and 0x7e47c7.
  3. In the second bunch of samples, the values are also mostly fixed, yet different: 0x7dc7c7 or 0x7dc7c8.
  4. When the headset was off (last samples), the values change mildly, but in an orderly sequence: nearly all are 0x___7c8, where ___ moves rather "orderly". Towards the end of these last samples it really looks a bit like an incremental sequence. Strange.
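Assuming the layout in point 1 (a 16-bit timestamp followed by six 24-bit values, i.e. a 20-byte packet), a parsing sketch in Python. Both the packet size and the big-endian byte order are guesses based on this thread, not a documented format:

```python
def parse_packet(raw: bytes):
    # Assumed layout: 16-bit counter + six 24-bit big-endian values = 20 bytes
    assert len(raw) == 20
    ts = int.from_bytes(raw[:2], "big")
    values = [int.from_bytes(raw[2 + 3 * i : 5 + 3 * i], "big") for i in range(6)]
    return ts, values
```

If point b below is right and these are really 12 x 12-bit values, the slicing would need to split each 3-byte group into two 12-bit halves instead.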

It's a bit hard to reach any conclusions from this text file, but:
a. This is clearly not voltage levels.
b. I doubt my initial statement: these may actually be 12 x 12-bit values rather than 6 x 24-bit.

Which Muse did you use, 2016 or 2018? Also, I assume no AUX was connected. I would suggest trying the same experiment while putting a finger on each sensor, one by one (including the DRL).

Thanks again.

@oori Thanks, that would be a great help!

The Muse device doesn't support this natively; it could be achieved via additional data processing, which the Muse SDK provided. I think it would be best to have this type of data quality analysis done in another application, though. I do not intend to support it in BlueMuse.