
Hand Gesture Recognition Project

Primary language: C++. License: MIT.


The main aim of this project is to provide a more natural human-computer interface. The user interface (UI) of the personal computer has evolved from a text-based command line to a graphical interface with keyboard and mouse inputs. Human-computer interaction will undoubtedly continue to evolve towards more natural forms of input, such as human movement recognition.

This project provides a keyboard and mouse interface driven by hand gestures: a video camera interprets the American one-handed sign language alphabet and number gestures (plus additional gestures for keyboard and mouse control). The goal is to recognize a set of 36 gestures comprising the 26 alphabetic characters (A-Z) and the 10 digits (0-9). Mouse control by hand gesture can also drive Windows applications such as media players, Microsoft Word, Paint, and games. The aim is a touchless interface, i.e., keyboard and mouse control through hand gestures, which also helps people with special needs (e.g., paralysis) control a computer from a distance. The system uses a single color camera mounted above a neutral-colored desk surface next to the computer. The camera's output is displayed on the monitor, and the user interacts with the system by gesturing within the camera's view; shape and position information about the hand is gathered from the webcam feed.
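The first step of any such pipeline is separating the hand from the neutral desk background. The project does not specify its segmentation method; as an illustrative stand-in, here is a classic per-pixel RGB skin heuristic (Peer et al.) that could seed the hand mask:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdlib>

// Classic RGB skin heuristic: a pixel counts as "skin" if it is bright
// enough, reddish, and has enough spread between its color channels.
// This is an illustrative sketch, not the project's actual segmentation.
bool isSkinPixel(std::uint8_t r, std::uint8_t g, std::uint8_t b) {
    int maxc = std::max({static_cast<int>(r), static_cast<int>(g),
                         static_cast<int>(b)});
    int minc = std::min({static_cast<int>(r), static_cast<int>(g),
                         static_cast<int>(b)});
    return r > 95 && g > 40 && b > 20 &&
           (maxc - minc) > 15 &&
           std::abs(static_cast<int>(r) - static_cast<int>(g)) > 15 &&
           r > g && r > b;
}
```

Applying this test to every pixel of a webcam frame yields a binary mask from which the hand's shape and centroid can be extracted.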

Use Cases:

  1. Users can control different applications with the hand alone; no keyboard or mouse is needed.
  2. The system can convert the one-handed sign language of people with special needs into characters.
  3. People with special needs can control applications from a distance with minimal body movement.
  4. Games can be played through hand gestures alone, without a keyboard or mouse.

Objective:

  1. Convert one-handed sign language gestures into alphabetic characters and digits.
  2. Move the mouse cursor according to the direction of hand movement.
  3. Perform a single click on rotation of the hand to the left.
  4. Perform a double click on rotation of the hand to the right.
  5. Control Media Player, Winamp, and other applications using hand gestures.
  6. Provide a gaming interface using hand gestures (e.g., NFS, Vice City).
  7. Draw figures and control Paint using hand gestures.
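Objectives 2-4 can be reduced to two small decisions per frame: which way the hand's centroid moved, and which way the hand rotated. The thresholds and names below are illustrative assumptions, not values from the project:

```cpp
#include <cmath>
#include <string>

// Map the frame-to-frame displacement of the hand centroid (in pixels)
// to a cursor direction. A small dead zone suppresses camera jitter.
std::string cursorDirection(double dx, double dy) {
    const double deadZone = 2.0;  // assumed jitter threshold, in pixels
    if (std::hypot(dx, dy) < deadZone) return "none";
    if (std::fabs(dx) >= std::fabs(dy))
        return dx > 0 ? "right" : "left";
    return dy > 0 ? "down" : "up";  // image y-axis points downward
}

// Map the change in the hand's orientation angle between frames
// (radians) to a click event: left rotation -> single click,
// right rotation -> double click, per objectives 3 and 4.
std::string clickFromRotation(double dAngle) {
    const double threshold = 0.35;  // assumed minimum rotation, ~20 degrees
    if (dAngle > threshold)  return "single-click";  // rotated left
    if (dAngle < -threshold) return "double-click";  // rotated right
    return "none";
}
```

In a full system, `dx`/`dy` would come from the segmented hand's centroid and `dAngle` from its principal-axis orientation in consecutive frames.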

Scope:

  1. To provide a keyboard interface through hand gestures. This part covers designing and building a man-machine interface that uses a video camera to interpret the American one-handed sign language alphabet and number gestures. The goal is to recognize a set of 36 gestures comprising the 26 alphabetic characters (A-Z) and the 10 digits (0-9).

  2. To provide mouse control through hand gestures. Mouse movements are controlled simply by moving the hand in different directions, and the left and right mouse button clicks are performed by rotating the hand in different directions.
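Once the recognizer assigns one of the 36 gesture classes described above, translating the class into a character is a fixed mapping. A minimal sketch, assuming class indices 0-25 denote A-Z and 26-35 denote 0-9 (the project's actual label ordering is not specified):

```cpp
// Hypothetical mapping from a classifier's output index (0-35) to the
// character it represents: indices 0-25 -> 'A'-'Z', 26-35 -> '0'-'9'.
char gestureToChar(int classIndex) {
    if (classIndex >= 0 && classIndex < 26)
        return static_cast<char>('A' + classIndex);
    if (classIndex >= 26 && classIndex < 36)
        return static_cast<char>('0' + (classIndex - 26));
    return '\0';  // unrecognized gesture class
}
```

The returned character would then be injected as a keystroke through the operating system's input API (e.g., `SendInput` on Windows).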