There aren’t many transcription services available, especially for sign language. Trying to improve your sign language by watching fluent signers is inefficient on its own: there is nothing readily available to compare against, so it is hard to register that a given hand gesture corresponds to a given word.

We have created a webapp that takes live images of sign language gestures and predicts the letter being displayed.
To run the app:
- Install the required packages from requirements.txt
- Run app.py and place your hand inside the red box
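Conceptually, once the app has an image from the red box, it scores it against one class per letter and reports the highest-scoring one. The sketch below illustrates only that final mapping step (the names `LETTERS` and `predict_letter` are illustrative, not taken from app.py; note that many static-image fingerspelling datasets omit J and Z because those letters involve motion):

```python
# Sketch: map a model's class scores to a letter.
# Names and values here are illustrative, not the actual app.py code.
import string

# ASL letters that can be shown as a static image
# (J and Z involve motion, so many datasets omit them).
LETTERS = [c for c in string.ascii_uppercase if c not in ("J", "Z")]

def predict_letter(scores):
    """Return the letter whose class score is highest."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return LETTERS[best]

# Example: 24 class scores with the highest at index 1
scores = [0.0] * len(LETTERS)
scores[1] = 1.0
print(predict_letter(scores))  # -> B
```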
To train the model:
- Install PyTorch with CUDA support
- Install the required packages from requirements.txt
- Open model_train.py and set model_file_name
- Run model_train.py
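The role of `model_file_name` in the steps above is to tell the training script where to write its checkpoint. A minimal, stdlib-only sketch of that save/reload pattern is below (the real script presumably uses `torch.save` / `torch.load` on a state dict; the pickle stand-in and the weights dict here are purely illustrative):

```python
# Sketch: save and reload a checkpoint under a configurable file name.
# This stands in for what model_train.py does with torch.save / torch.load.
import os
import pickle
import tempfile

model_file_name = os.path.join(tempfile.gettempdir(), "sign_model.pkl")

# Placeholder "state dict" of model weights.
weights = {"layer1": [0.1, 0.2], "layer2": [0.3]}

with open(model_file_name, "wb") as f:
    pickle.dump(weights, f)

with open(model_file_name, "rb") as f:
    restored = pickle.load(f)

print(restored == weights)  # -> True
```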
The presentation can be found here.
Dataset provided by this competition on Kaggle.
Training model adapted from Vijay Vignesh P.