virtual-painting-app

This repository contains a virtual painting application that combines OpenCV, Mediapipe, and a custom machine learning model trained on a dataset of hand gestures. The application is implemented in Python, using TensorFlow for the machine learning components and Mediapipe for hand tracking. The user interface, built with PyQt, lets the user create virtual paintings and control brush size, color, and other drawing parameters through hand gestures.
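The repository contains the full implementation; as a rough illustration of the tracking-plus-drawing loop described above, here is a minimal sketch using Mediapipe's Hands solution and OpenCV. The landmark index, thresholds, and brush settings below are illustrative assumptions, not values taken from this project's code:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)
canvas = None

with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror the view for natural interaction
        if canvas is None:
            canvas = frame.copy() * 0  # black canvas, same size as the frame

        # Mediapipe expects RGB input; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            h, w, _ = frame.shape
            # Landmark 8 is the index fingertip; use it as the brush position
            x, y = int(lm[8].x * w), int(lm[8].y * h)
            cv2.circle(canvas, (x, y), 8, (0, 255, 0), -1)

        # Blend the canvas over the live camera frame
        out = cv2.addWeighted(frame, 0.5, canvas, 0.5, 0)
        cv2.imshow("virtual painting", out)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```

In the actual application, the gesture classifier would decide when the fingertip position paints, erases, or changes brush settings, rather than drawing unconditionally as in this sketch.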

The machine learning model was trained on a dataset of over 10,000 hand gestures, achieving 95% accuracy on the test set. This project demonstrates the potential of combining computer vision and machine learning techniques to enable natural, intuitive interaction with virtual applications. We hope it can serve as a useful resource for those interested in exploring similar projects.
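The README does not specify the classifier architecture. One common approach for landmark-based gesture recognition is a small dense network over the 21 Mediapipe hand landmarks; the sketch below follows that pattern, with layer sizes and the gesture count chosen as placeholder assumptions rather than this project's actual configuration:

```python
import tensorflow as tf

NUM_GESTURES = 5  # hypothetical; the real gesture set is defined by the dataset

# Each sample: 21 hand landmarks x (x, y, z) = 63 features from Mediapipe
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(63,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# x_train: (n_samples, 63) flattened landmark vectors
# y_train: integer gesture labels in [0, NUM_GESTURES)
# model.fit(x_train, y_train, epochs=20, validation_split=0.1)
```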