Hand Gesture Recognition Model

Overview

This repository contains code for training a hand gesture recognition model that identifies and classifies hand gestures in image or video data. The goal is to enable intuitive human-computer interaction and gesture-based control systems.

The model is built with deep learning techniques and trained on the LeapGestRecog dataset, available at https://www.kaggle.com/datasets/gti-upm/leapgestrecog. The dataset consists of near-infrared images of hand gestures captured with the Leap Motion Controller.
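As a rough illustration of how such a model might be put together, the sketch below trains a small convolutional network with Keras. It is not the implementation in this repository: the data path, the 120x320 input size, the 10-class count, and the assumption that images have already been rearranged into one folder per gesture class are all illustrative. (The raw Kaggle download groups images by subject rather than by class, so a regrouping step would normally come first.)

```python
# Minimal CNN sketch. Assumptions: images rearranged into data/leapgestrecog/<gesture>/*.png
# (one folder per class); image size and class count follow the public leapgestrecog layout.
import tensorflow as tf

IMG_SIZE = (120, 320)   # assumed downscale of the 240x640 Leap Motion frames
NUM_CLASSES = 10        # leapgestrecog ships 10 gesture classes

# Load grayscale images with an 80/20 train/validation split.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/leapgestrecog",            # hypothetical path, not part of this repo
    validation_split=0.2, subset="training", seed=42,
    color_mode="grayscale", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/leapgestrecog",
    validation_split=0.2, subset="validation", seed=42,
    color_mode="grayscale", image_size=IMG_SIZE, batch_size=32)

# Small convolutional classifier: two conv/pool stages followed by a dense head.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (1,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Integer labels from image_dataset_from_directory pair with sparse categorical cross-entropy.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```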

Requirements

To run the code in this repository, you will need the following dependencies (a quick version check follows the list):

Python (>=3.6)

TensorFlow (>=2.0)

Keras (>=2.3)

NumPy

Matplotlib

OpenCV
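
Once the packages are installed, a short sanity check such as the snippet below confirms that each dependency imports correctly and reports its version. The import names are the standard ones for these packages; nothing here is specific to this repository.

```python
# Quick environment check for the dependencies listed above.
import sys
import tensorflow as tf
import keras
import numpy as np
import matplotlib
import cv2

print("Python     :", sys.version.split()[0])
print("TensorFlow :", tf.__version__)
print("Keras      :", keras.__version__)
print("NumPy      :", np.__version__)
print("Matplotlib :", matplotlib.__version__)
print("OpenCV     :", cv2.__version__)
```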

Acknowledgments

LeapGestRecog dataset contributors

Kaggle for hosting the dataset