Human Activity Recognition using a smartphone dataset
Jupyter Notebook
Human_Activity_Recognition
UCI HAR dataset
The pre-processing steps included:
Pre-processing accelerometer and gyroscope signals using noise filters. Sensor data is captured at a frequency of 50 Hz.
Splitting the data into fixed windows of 2.56 seconds (128 data points) with 50% overlap (a windowing sketch follows this list). Splitting the accelerometer data into gravitational (total) and body motion components.
A number of time- and frequency-domain features commonly used in the field of human activity recognition were extracted from each window, resulting in a 561-element feature vector.
The dataset was split into train (70%) and test (30%) sets by subject, i.e. 21 subjects for training and 9 for testing.
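As a rough illustration of the windowing step above (not the notebook's exact pipeline), the sketch below segments a multi-channel signal into 128-sample windows with 50% overlap; the function name sliding_windows and the synthetic input are assumptions made for the example.

import numpy as np

def sliding_windows(signal, window_size=128, overlap=0.5):
    """Split a signal of shape (timesteps, channels) into fixed-length
    windows with the given fractional overlap."""
    step = int(window_size * (1 - overlap))  # 64 samples for 50% overlap
    windows = []
    for start in range(0, len(signal) - window_size + 1, step):
        windows.append(signal[start:start + window_size])
    return np.stack(windows)  # shape: (n_windows, window_size, channels)

# Example: 10 seconds of 3-axis accelerometer data sampled at 50 Hz
fs = 50
raw = np.random.randn(10 * fs, 3)
windows = sliding_windows(raw)  # 2.56 s windows = 128 samples each
print(windows.shape)            # (6, 128, 3)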
Own dataset - MATLAB Android application
ConvLSTM approach
CNN -> reads subsequences of the main sequence in blocks and extracts features from each block.
LSTM -> interprets the features extracted from each block.
Input:
Samples: n, for the number of windows in the dataset.
Time: 4, for the four subsequences that we split a window of 128 time steps into.
Rows: 1, for the one-dimensional shape of each subsequence.
Columns: 32, for the 32 time steps in an input subsequence.
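A minimal sketch of a ConvLSTM2D model with this input layout, assuming Keras/TensorFlow, 9 raw inertial signal channels and 6 activity classes (as in the UCI HAR dataset); the filter count, dropout rate, and dense layer sizes are illustrative choices, not the notebook's exact configuration.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import ConvLSTM2D, Dropout, Flatten, Dense

n_time, n_rows, n_cols = 4, 1, 32  # 4 subsequences of 32 steps (4 * 32 = 128)
n_channels = 9                     # assumed: 9 raw inertial signal channels (UCI HAR)
n_classes = 6                      # assumed: 6 activity classes (UCI HAR)

# Windows start as (samples, 128, channels) and are reshaped to the
# 5-D input expected by ConvLSTM2D: (samples, time, rows, columns, channels).
X = np.random.randn(100, 128, n_channels).astype('float32')  # dummy data for the sketch
X = X.reshape((X.shape[0], n_time, n_rows, n_cols, n_channels))

model = Sequential([
    ConvLSTM2D(filters=64, kernel_size=(1, 3), activation='relu',
               input_shape=(n_time, n_rows, n_cols, n_channels)),
    Dropout(0.5),
    Flatten(),
    Dense(100, activation='relu'),
    Dense(n_classes, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()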