Gesture Controlled Virtual Mouse

Gesture Controlled Virtual Mouse makes human-computer interaction simple by using hand gestures; the computer requires almost no direct contact. All I/O operations can be controlled virtually using static and dynamic hand gestures. This project uses state-of-the-art machine learning and computer vision algorithms to recognize hand gestures and voice commands, and it works smoothly without any additional hardware. It leverages models such as the CNN implemented by MediaPipe, running on top of pybind11. It consists of two modules, one of which works directly on hands using MediaPipe hand detection. Currently it works on the Windows platform.
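As background for how the MediaPipe-based detection above works, here is a minimal sketch of driving MediaPipe Hands from Python. It illustrates the public mediapipe API and is not the project's actual code; it assumes the mediapipe and opencv-python packages are installed:

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("MediaPipe Hands", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
cv2.destroyAllWindows()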

Note: Use Python version 3.8.5

Features


Gesture Recognition:

Neutral Gesture
Neutral gesture. Used to halt/stop execution of the current gesture.
Move Cursor
The cursor is assigned to the midpoint of the index and middle fingertips. This gesture moves the cursor to the desired location; the speed of cursor movement is proportional to the speed of the hand (see the sketch after this list).
Right Click
Gesture for a single right click.
Left Click
Gesture for a single left click.
Scrolling
Dynamic gestures for horizontal and vertical scrolling. The scroll speed is proportional to the distance the pinch gesture moves from its start point. Vertical and horizontal scrolls are controlled by vertical and horizontal pinch movements respectively.
Drag and Drop
Gesture for drag-and-drop functionality. Can be used to move/transfer files from one directory to another.
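To make the cursor-movement gesture concrete, the following is a minimal sketch, assuming MediaPipe's fingertip landmark indices (8 for the index tip, 12 for the middle tip) and the pyautogui package. It maps the fingertip midpoint straight to screen pixels, whereas the project scales movement with hand speed:

import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()

def move_cursor(hand_landmarks):
    # Fingertip landmarks: 8 = index tip, 12 = middle tip.
    index_tip = hand_landmarks.landmark[8]
    middle_tip = hand_landmarks.landmark[12]
    # Landmark coordinates are normalized to [0, 1];
    # scale the midpoint to screen pixels.
    x = (index_tip.x + middle_tip.x) / 2 * SCREEN_W
    y = (index_tip.y + middle_tip.y) / 2 * SCREEN_H
    pyautogui.moveTo(x, y)

move_cursor can then be called with each results.multi_hand_landmarks entry from the detection loop sketched earlier.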

Getting Started

Pre-requisites

Python: 3.6 to 3.8.5
Anaconda Distribution: download it from https://www.anaconda.com/.

Procedure

git clone https://github.com/xenon-19/Gesture-Controlled-Virtual-Mouse.git

For detailed information about cloning, see GitHub's documentation on cloning a repository.

Step 1:

conda create --name gest python=3.8.5

Step 2:

conda activate gest

Step 3:

pip install -r requirements.txt

Step 4:

python Gesture_Dataset.py

The script asks for a gesture name and captures samples of that gesture to build the dataset.
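Gesture_Dataset.py itself is not reproduced here, but a capture script of this kind might look roughly like the sketch below; the prompt, the Dataset/<gesture>/ folder layout, the sample count, and the key bindings are all assumptions made for illustration:

import os
import cv2

gesture = input("Gesture name: ").strip()   # hypothetical prompt
out_dir = os.path.join("Dataset", gesture)  # hypothetical folder layout
os.makedirs(out_dir, exist_ok=True)

cap = cv2.VideoCapture(0)
count = 0
while count < 200:  # samples per gesture: an assumption
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("SPACE = save frame, ESC = stop", frame)
    key = cv2.waitKey(1) & 0xFF
    if key == 32:    # SPACE saves the current frame
        cv2.imwrite(os.path.join(out_dir, f"{gesture}_{count:04d}.jpg"), frame)
        count += 1
    elif key == 27:  # ESC stops early
        break
cap.release()
cv2.destroyAllWindows()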

Step 5:

Open Model_creation.ipynb in Google Colab (the training code isn't currently supported on Windows), run all the code blocks, then download the generated task file and place it in the Model directory.
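The notebook's contents are not reproduced here, but training a gesture recognizer that exports a .task file typically follows MediaPipe Model Maker's gesture_recognizer API, roughly as below; the Dataset folder name and the split ratios are assumptions:

from mediapipe_model_maker import gesture_recognizer

# Assumed layout: Dataset/<label>/*.jpg, including a "none" label.
data = gesture_recognizer.Dataset.from_folder(
    dirname="Dataset",
    hparams=gesture_recognizer.HandDataPreprocessingParams(),
)
train_data, rest = data.split(0.8)
validation_data, test_data = rest.split(0.5)

options = gesture_recognizer.GestureRecognizerOptions(
    hparams=gesture_recognizer.HParams(export_dir="exported_model")
)
model = gesture_recognizer.GestureRecognizer.create(
    train_data=train_data,
    validation_data=validation_data,
    options=options,
)
model.export_model()  # writes gesture_recognizer.task into export_dir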

Step 6:

python Gesture_Controller.py
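For reference, loading the exported task file and classifying webcam frames with the MediaPipe Tasks API looks roughly like the sketch below. The Model/gesture_recognizer.task path is an assumption, and Gesture_Controller.py additionally maps recognized gestures to mouse actions, which this sketch omits:

import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

# Assumed path to the task file produced in Step 5.
base = mp_tasks.BaseOptions(model_asset_path="Model/gesture_recognizer.task")
recognizer = vision.GestureRecognizer.create_from_options(
    vision.GestureRecognizerOptions(base_options=base)
)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = recognizer.recognize(mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb))
    if result.gestures:
        top = result.gestures[0][0]  # best category for the first detected hand
        print(top.category_name, round(top.score, 2))
    cv2.imshow("Preview", frame)     # a window is needed for waitKey to see keys
    if cv2.waitKey(1) & 0xFF == 27:  # ESC quits
        break
cap.release()
cv2.destroyAllWindows()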