Gesture Recognition: Case study IIITB & Upgrad

Recognising 5 different hand gestures to control a smart TV

We need to develop a feature for the smart TV that can recognise five different gestures performed by the user, so that users can control the TV without a remote.

The gestures are continuously monitored by the webcam mounted on the TV. Each gesture corresponds to a specific command:

  • Thumbs up: Increase the volume
  • Thumbs down: Decrease the volume
  • Left swipe: 'Jump' backwards 10 seconds
  • Right swipe: 'Jump' forward 10 seconds
  • Stop: Pause the movie

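For illustration, these commands could be wired up as a simple lookup on the model's predicted class index. The sketch below is a hypothetical mapping; the index order and command names are assumptions, since the actual label ordering comes from the dataset's CSV files.

```python
# Hypothetical mapping from predicted class index to TV command.
# The index order and command names are assumptions for illustration,
# not taken from the dataset's label files.
GESTURE_COMMANDS = {
    0: "volume_up",         # Thumbs up
    1: "volume_down",       # Thumbs down
    2: "jump_back_10s",     # Left swipe
    3: "jump_forward_10s",  # Right swipe
    4: "pause",             # Stop
}

def command_for(predicted_class: int) -> str:
    """Translate a model's predicted class index into the TV command to execute."""
    return GESTURE_COMMANDS[predicted_class]
```
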
The training data consists of a few hundred videos categorised into one of the five classes. Each video (typically 2-3 seconds long) is divided into a sequence of 30 frames (images). These videos have been recorded by various people performing one of the five gestures in front of a webcam - similar to what the smart TV will use.

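Because each sample is a folder of 30 frame images rather than a single file, a custom batch generator is typically used to feed the data to a model. Below is a minimal sketch, assuming one subfolder per video, CSV rows of the form `folder_name;gesture_name;label`, and a 120x120 target frame size; the function and parameter names are illustrative, not the exact project code.

```python
import os
import numpy as np
from imageio import imread
from skimage.transform import resize

def batch_generator(source_path, sample_rows, batch_size=16,
                    num_frames=30, height=120, width=120, num_classes=5):
    """Yield (videos, one_hot_labels) batches indefinitely.

    Each row in `sample_rows` is assumed to look like
    'folder_name;gesture_name;label' (an assumed CSV layout).
    """
    while True:
        np.random.shuffle(sample_rows)
        for start in range(0, len(sample_rows) - batch_size + 1, batch_size):
            batch_x = np.zeros((batch_size, num_frames, height, width, 3))
            batch_y = np.zeros((batch_size, num_classes))
            for i, row in enumerate(sample_rows[start:start + batch_size]):
                folder, _, label = row.strip().split(';')
                frame_files = sorted(os.listdir(os.path.join(source_path, folder)))
                for t, fname in enumerate(frame_files[:num_frames]):
                    img = imread(os.path.join(source_path, folder, fname))
                    # resize converts to floats in [0, 1], which doubles as normalisation
                    batch_x[i, t] = resize(img, (height, width))
                batch_y[i, int(label)] = 1
            yield batch_x, batch_y
```

A generator like this would typically be passed to Keras's `fit` with `steps_per_epoch` set to `len(sample_rows) // batch_size`.
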
Dataset : https://drive.google.com/uc?id=1ehyrYBQ5rbQQe6yL4XbLWe3FMvuVUGiL

Contributors: