
hand-parser

Introduction

This is a Python implementation of MediaPipe hand tracking that uses finger states to predict hand gestures. You can read the details of the approach this project is based on here. The difference is that I don't determine finger states with a trained network; I compute them with simple geometric calculations.
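As a rough illustration of what those geometric calculations can look like (this is not the repository's exact code; the function names, the 160° threshold, and the landmark indices in the comment are illustrative assumptions), a finger can be treated as open when the chain of landmarks around its PIP joint is nearly straight:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by the segments b->a and b->c."""
    a, b, c = np.asarray(a, float), np.asarray(b, float), np.asarray(c, float)
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-6)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def finger_is_open(points, mcp, pip, tip, straight_deg=160.0):
    """Treat a finger as open when its PIP joint is nearly straight.
    `points` is a list of (x, y) landmarks indexed as in MediaPipe Hands."""
    return joint_angle(points[mcp], points[pip], points[tip]) > straight_deg

# e.g. the index finger uses MediaPipe landmark indices 5 (MCP), 6 (PIP), 8 (tip):
# index_open = finger_is_open(points, 5, 6, 8)
```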

Dependencies

  • opencv-python >= 4.0

  • TensorFlow >= 2.0 (GPU is not required)

  • PyTorch >= 1.1

  • NumPy

  • Pillow

Usage

  1. Install required dependencies.
  2. Run python app.py to start gesture detection (see the sketch below for the kind of loop it runs).
  3. Run python app-mouse.py to control the mouse with your index finger; use action "2" to click. Because this project doesn't use the GPU and runs at a low FPS, I'll stop developing this feature.
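The scripts themselves are not reproduced here; the following is only a hedged sketch of the kind of capture-and-detect loop app.py presumably runs (it draws landmarks only; the finger-state and gesture logic is described in utils/hand_track_utils.py):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images, OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("hand-parser", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```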

Custom

  • You can use the landmark points detected by MediaPipe to predict or configure your own labels.

    e.g.: the "CATCH" label is predicted from the distance between the index finger and the thumb, while the other labels are predicted from joint angles (see the sketch after this list).

  • Read utils/hand_track_utils.py and the referenced project (https://github.com/Prasad9/Classify-HandGesturePose) for more detail.
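A minimal sketch of a distance-based "CATCH" check (illustrative only; the 0.35 ratio and the use of the wrist-to-index-MCP distance as a palm-length reference are assumptions, not the repository's actual thresholds):

```python
import numpy as np

# MediaPipe Hands landmark indices: 0 = wrist, 4 = thumb tip,
# 5 = index MCP, 8 = index tip.
WRIST, THUMB_TIP, INDEX_MCP, INDEX_TIP = 0, 4, 5, 8

def is_catch(points, pinch_ratio=0.35):
    """Label the pose "CATCH" when the thumb tip and index tip are close,
    measured relative to a palm-length reference (wrist -> index MCP).
    The 0.35 ratio is an illustrative guess, not a tuned value."""
    pts = np.asarray(points, dtype=float)
    pinch = np.linalg.norm(pts[THUMB_TIP] - pts[INDEX_TIP])
    palm = np.linalg.norm(pts[WRIST] - pts[INDEX_MCP]) + 1e-6
    return pinch / palm < pinch_ratio
```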

Demo


License

Apache License 2.0