In this project, I applied transfer learning to an Inception-v3 model to recognize the letters and numbers of American Sign Language, reaching 99% accuracy on the test set. You can find the training notebook here.
To run this project:

- Download the model here.
- I have deployed the app using Streamlit, where you can upload a picture directly and get the prediction. To test it by uploading images, run:

  `streamlit run main.py`
- I have also implemented real-time ASL detection using Google's MediaPipe, which handles hand detection. To see live ASL recognition, run:

  `python live.py`
- MediaPipe: https://google.github.io/mediapipe/
- Streamlit: https://streamlit.io/
- loicmarie/sign-language-alphabet-recognizer: https://github.com/loicmarie/sign-language-alphabet-recognizer/archive/refs/heads/master.zip
- PyTorch forum, using ImageFolder and random_split with multiple transforms: https://discuss.pytorch.org/t/using-imagefolder-random-split-with-multiple-transforms/79899/4
- Kaggle, Intro to PyTorch: loading image data: https://www.kaggle.com/leifuer/intro-to-pytorch-loading-image-data