UCI-Human-Activity-Recognition

Human Activity Recognition (HAR) using the UCI HAR dataset. The task is to classify the type of movement among six categories:

  • WALKING
  • WALKING_UPSTAIRS
  • WALKING_DOWNSTAIRS
  • SITTING
  • STANDING
  • LAYING
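
As a minimal Python sketch of working with these class labels, the snippet below loads the training labels and maps the numeric class ids to the six activity names above. It assumes the standard directory layout of the UCI HAR download (`activity_labels.txt` and `train/y_train.txt` inside the extracted `UCI HAR Dataset` folder); adjust the path if your copy lives elsewhere.

```python
import pandas as pd

# Assumed path to the extracted "UCI HAR Dataset" folder
DATA_DIR = "UCI HAR Dataset"

# activity_labels.txt maps the numeric class id (1-6) to the activity name
activity_map = pd.read_csv(
    f"{DATA_DIR}/activity_labels.txt",
    sep=r"\s+", header=None, names=["id", "activity"],
).set_index("id")["activity"]

# y_train.txt holds one activity id per window; map ids to readable names
y_train = pd.read_csv(
    f"{DATA_DIR}/train/y_train.txt", header=None, names=["id"]
)["id"].map(activity_map)

print(y_train.value_counts())  # class balance across the six activities
```

The same pattern should work for the test split (`test/y_test.txt`) in the standard layout.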

Details about the input data

  • The dataset was collected from 30 persons (referred to as subjects in this dataset), each performing different activities with a smartphone attached to their waist. The data was recorded with the smartphone's sensors (accelerometer and gyroscope), and the experiments were video-recorded so the data could be labelled manually.
  • The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec with 50% overlap (128 readings/window). A minimal sketch of this windowing and filtering is shown after this list.
  • The sensor acceleration signal, which has gravitational and body-motion components, was separated into body acceleration and gravity using a Butterworth low-pass filter.
  • The gravitational force is assumed to have only low-frequency components, therefore a filter with a 0.3 Hz cutoff frequency was used.
  • Further details about the dataset can be found in this file.
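
The dataset already ships windowed and filtered, but for reference, here is a minimal sketch of how this pre-processing could be reproduced on a raw acceleration signal with SciPy. The 50 Hz sampling rate follows from 128 readings per 2.56 s window; the filter order and the synthetic input trace are illustrative assumptions, not the exact settings used by the dataset authors.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0           # sampling rate of the smartphone sensors (128 / 2.56 s = 50 Hz)
WINDOW = 128        # 2.56 s windows -> 128 readings per window
STEP = WINDOW // 2  # 50% overlap between consecutive windows

def split_gravity(total_acc, cutoff_hz=0.3, order=3):
    """Split a total-acceleration axis into gravity and body components
    with a low-pass Butterworth filter (0.3 Hz cutoff, as described above)."""
    b, a = butter(order, cutoff_hz / (FS / 2.0), btype="low")
    gravity = filtfilt(b, a, total_acc)   # low-frequency part = gravity
    body = total_acc - gravity            # remainder = body motion
    return body, gravity

def sliding_windows(signal, window=WINDOW, step=STEP):
    """Cut a 1-D signal into fixed-width windows with 50% overlap."""
    return np.array([signal[i:i + window]
                     for i in range(0, len(signal) - window + 1, step)])

# Example on a synthetic accelerometer trace: constant gravity (~1 g)
# plus a 2 Hz body-motion component.
t = np.arange(0, 20, 1 / FS)
acc_x = 1.0 + 0.2 * np.sin(2 * np.pi * 2.0 * t)
body, gravity = split_gravity(acc_x)
windows = sliding_windows(body)
print(windows.shape)  # (number of windows, 128)
```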

Project details

To be updated soon.