
My attempts at human activity recognition using CASAS smart home sensor data.

Human Activity Recognition


The Problem:


Smart home technology has become increasingly common in the information age. Provided that privacy is diligently protected, data collected from a smart home can help disabled people with their day-to-day tasks. Such assistance is already available in limited forms, such as reminders on your phone, but imagine your entire house acting as a sort of butler for you. When you get dressed, your closet suggests outfits you haven't worn in a while. When you take your medication, your pill dispenser automatically gives you the pills you need and orders refills when necessary. When you cook, your fridge suggests new recipes based on the food it contains, and your stovetop turns off automatically when you leave home. All of this may sound like an invasion of privacy, but for people who struggle with day-to-day tasks such as cooking, dressing, and taking medication, it could mean the difference between spending the rest of their lives in a nursing home and staying in the comfort of their own home.

The Solution (One Bite of the Elephant):


Before developing technology to help residents, it is necessary to build a model that can classify what a resident is doing in their smart home. This project focuses on just that: using the CASAS assisted-living dataset to build a supervised model that classifies human activity from smart home sensor readings.
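A minimal sketch of that core idea: slice the sensor event stream into labelled windows, turn each window into a per-sensor event-count feature vector, and fit a standard classifier. The sensor IDs, activity names, and toy data generator below are hypothetical stand-ins for the real CASAS event logs, not the project's actual pipeline.

```python
# Sketch: window-based activity classification from CASAS-style sensor events.
# Sensor IDs, activities, and the data generator are hypothetical placeholders.
import random
from collections import Counter

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

SENSORS = [f"M{i:02d}" for i in range(1, 11)]  # motion sensors (made-up IDs)

def make_window(activity):
    """Simulate one labelled window of sensor events for a toy activity."""
    # Each toy activity fires a characteristic subset of sensors.
    hot = {"sleep": SENSORS[:3], "cook": SENSORS[3:7], "work": SENSORS[7:]}[activity]
    events = [random.choice(hot) for _ in range(random.randint(10, 30))]
    counts = Counter(events)
    # Feature vector: number of events per sensor within the window.
    return [counts.get(s, 0) for s in SENSORS], activity

random.seed(0)
data = [make_window(a) for a in ("sleep", "cook", "work") for _ in range(60)]
X, y = zip(*data)

X_train, X_test, y_train, y_test = train_test_split(
    list(X), list(y), test_size=0.25, random_state=0
)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With real CASAS logs, the `make_window` step would instead parse timestamped `sensor value` lines from the dataset files, but the featurize-then-classify shape stays the same.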

The CASAS dataset was retrieved from the UCI Machine Learning Repository.

Enjoy!

Here's an example of the sensor layout of one of the 30 houses: Sensor Map Layout (image)