
# Human Activity Recognition with Smartphones

If you loved my work, smash that ⭐ button.


We will be working with the Human Activity Recognition with Smartphones dataset. It was built from recordings of study participants performing activities of daily living (ADL) while carrying a smartphone with embedded inertial sensors. The objective is to classify each recording into one of six activities: walking, walking upstairs, walking downstairs, sitting, standing, and laying.

The dataset consists of:

- Triaxial acceleration from the accelerometer (total acceleration) and the estimated body acceleration.
- Triaxial angular velocity from the gyroscope.
- A 561-feature vector with time- and frequency-domain variables.
- An activity label.
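Since each sample is already a 561-dimensional feature vector with one of six activity labels, a standard multi-class classifier can be trained directly on it. The sketch below illustrates that setup with scikit-learn; the data here is randomly generated to match the dataset's shape (561 features, labels 1–6), not the real recordings, so the reported accuracy is meaningless and the pipeline is what matters.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in data mimicking the dataset's shape: 600 windows x 561 features,
# with activity labels 1..6 (walking ... laying). Replace with the real
# feature vectors and labels from the dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 561))
y = rng.integers(1, 7, size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy: {accuracy_score(y_te, pred):.3f}")
```

On the real data the same pipeline applies unchanged; only the loading step (reading the feature vectors and activity labels into `X` and `y`) differs.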