Sign Language Translator for Alphabetical Characters 1.0

Primary language: Python

Hand-Gesture-Reader

What does this code do?

  • Reads simple static hand gestures in Sign Language, such as alphabetical symbols, using a pre-trained model.
    (I trained almost all gestures except Z and J, since those require motion.)
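Since Z and J are excluded, the classifier works over the 24 static letters. A minimal sketch of the label map such a model might use (the index order is an assumption, not the repo's actual encoding):

```python
# Hypothetical label map for the 24 static ASL letters.
# Z and J are excluded because they require motion, as noted above.
import string

STATIC_LETTERS = [c for c in string.ascii_uppercase if c not in ("J", "Z")]

# Map the classifier's integer output to a letter, e.g. 0 -> "A".
LABELS = {i: letter for i, letter in enumerate(STATIC_LETTERS)}

print(len(LABELS))  # 24
```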


How does it work?

  • It is fairly easy to use, since all the input data has already been processed on my computer.
  • Just put your hand in front of your camera. (Remember: one hand only.)
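Pipelines like this typically detect 21 hand landmarks per frame (MediaPipe's hand model) and normalize their coordinates before feeding them to the classifier. A hedged sketch of that preprocessing step — the function name and the exact normalization scheme are assumptions, not this repo's code:

```python
# Assumed feature preprocessing: normalize 21 (x, y) hand landmarks
# relative to the hand's bounding box so the features are invariant
# to where the hand appears in the frame.
import numpy as np

def landmarks_to_features(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (21, 2) array of (x, y) in image coordinates."""
    shifted = landmarks - landmarks.min(axis=0)   # anchor at the hand's corner
    scale = shifted.max() or 1.0                  # avoid division by zero
    return (shifted / scale).ravel()              # flatten to 42 features

# A fake hand: 21 random points somewhere in a 480x480 region.
rng = np.random.default_rng(0)
features = landmarks_to_features(rng.uniform(0, 480, size=(21, 2)))
print(features.shape)  # (42,)
```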

Manual guide on how to run the code:

Installing libraries and dependencies:

pip3 install opencv-python tensorflow mediapipe numpy
  • The model is already trained and the images have been converted into a readable form in model.p, so you do not have to train it manually.
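The `.p` extension suggests model.p is a Python pickle. A hedged sketch of how such a file is typically loaded — the `"model"` dictionary key is an assumption about the file's layout, not something this README specifies:

```python
# Assumed loader for a pickled model file like model.p.
import os
import pickle
import tempfile

def load_model(path):
    with open(path, "rb") as f:
        payload = pickle.load(f)
    # Many training scripts store the classifier under a "model" key;
    # fall back to the payload itself otherwise.
    return payload["model"] if isinstance(payload, dict) else payload

# Demo with a stand-in object so the sketch is self-contained.
with tempfile.NamedTemporaryFile(suffix=".p", delete=False) as f:
    pickle.dump({"model": "dummy-classifier"}, f)
clf = load_model(f.name)
os.unlink(f.name)
print(clf)  # dummy-classifier
```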

Open a terminal (zsh, cmd, or Command Prompt) and run the code:

python3 inference.py


A window like the one below should pop up, and here is the demonstration:
Note: The model can still be somewhat inaccurate and may suffer input overload, so try again if it exits unexpectedly (and it occasionally will).
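One possible workaround for those occasional exits is to wrap the capture/inference loop so it restarts itself a few times instead of quitting. This is a generic sketch, not code from the repo; `run_inference` stands in for whatever the real loop in inference.py does:

```python
# Assumed retry wrapper around a flaky capture/inference loop.
def run_with_retries(run_inference, max_retries=3):
    for attempt in range(1, max_retries + 1):
        try:
            return run_inference()
        except RuntimeError as exc:   # e.g. a camera or model hiccup
            print(f"attempt {attempt} failed: {exc}")
    raise RuntimeError("giving up after repeated failures")

# Demo: a stand-in loop that fails twice, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("transient camera error")
    return "ok"

print(run_with_retries(flaky))  # ok
```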