This project streams sensor data on heart and electrodermal activity from audience members and integrates it into staging elements such as visual projections, music, and lighting. The internal states of the audience thus directly influence the staging: artists gain a more direct perception of audience members' inner reactions and can create physical expressions in response to them. Here we share the physiological dataset collected during the streaming, in which heart and electrodermal activity were mapped to changes in the staging elements.
Project Website: http://boiling-mind.org/
This is a sample dataset collected from one performance. The complete dataset consists of multi-modal audience signals (EDA, BVP, wrist acceleration, and angular velocity) recorded over three performances:
- Performance 1: 34 recordings
- Performance 2: 31 recordings
- Performance 3: 33 recordings
- ‘_eda.csv’ : EDA dataset (see the loading sketch after this file list)
- ‘_bvp.csv’ : BVP dataset
- ‘_hrt.csv’ : IBI dataset (automatically generated by devices)
- ‘_acc.csv’ : Wrist acceleration dataset
- ‘_gyr.csv’ : Angular velocity dataset
- ‘sampleCodeEDAextrema.ipynb’ : Sample code in Python to process the EDA dataset (Performance 3).
- ‘sampleCodeEDAextrema_JupyterNotebook.pdf’ : PDF version of the EDA sample code with example plots.
- ‘sampleCodeHRV.ipynb’ : Sample code in Python to process the BVP dataset (Performance 3).
- ‘sampleCodeHRV_JupyterNotebook.pdf’ : PDF version of the HRV sample code with example plots.
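As a quick-start complement to the sample notebooks, the sketch below shows one way to load the files of a single recording with pandas. The prefix "P3_01" is hypothetical; substitute the actual file prefix shared by a recording's files in the dataset.

```python
import pandas as pd

# Hypothetical recording prefix; replace it with the actual prefix
# shared by one recording's files in the dataset.
prefix = "P3_01"

# Each modality lives in its own CSV file with the suffix listed above.
eda = pd.read_csv(f"{prefix}_eda.csv")  # electrodermal activity
bvp = pd.read_csv(f"{prefix}_bvp.csv")  # blood volume pulse
hrt = pd.read_csv(f"{prefix}_hrt.csv")  # inter-beat intervals (device-generated)
acc = pd.read_csv(f"{prefix}_acc.csv")  # wrist acceleration
gyr = pd.read_csv(f"{prefix}_gyr.csv")  # angular velocity

# Quick sanity check: print the shape and first few columns of each file.
for name, df in [("eda", eda), ("bvp", bvp), ("hrt", hrt),
                 ("acc", acc), ("gyr", gyr)]:
    print(name, df.shape, list(df.columns)[:4])
```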
- localTime : Local timestamp in milliseconds on the recording server at the time a data packet arrived at the server. All samples in a packet (an approximately 400 ms window) are labeled with the same local time.
- remoteTime : Number of milliseconds elapsed since the recording device was turned on. Each sample is labeled with its exact time of measurement (see the alignment sketch below).
- label : Labels for synchronization with the video recordings.
All other columns are data fields containing the sensor readings.
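Given these timing columns, the sketch below shows one possible way to place each sample on the server's timeline. This is an illustrative alignment strategy, not necessarily the processing used in the sample notebooks, and the filename is hypothetical.

```python
import pandas as pd

eda = pd.read_csv("P3_01_eda.csv")  # hypothetical filename

# localTime is the server arrival time of a whole packet, so all samples
# in a packet share it, while remoteTime is per-sample. Assuming a roughly
# constant transmission delay, the median difference gives a robust
# estimate of the device-to-server clock offset.
offset_ms = (eda["localTime"] - eda["remoteTime"]).median()

# Approximate per-sample server-side timestamps in milliseconds.
eda["serverTime"] = eda["remoteTime"] + offset_ms
eda["datetime"] = pd.to_datetime(eda["serverTime"], unit="ms")
```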