Real-time American Sign Language Recognition
preprocessing_and_training.ipynb contains the preprocessing and the model, real_time.ipynb performs the real-time American Sign Language recognition, and the third file is the trained model obtained by training on a GPU.
Python 3
TensorFlow
Keras
OpenCV
Matplotlib
CUDA 9.0
In this project I have used the Kaggle Sign Language MNIST dataset. The model takes live video from the webcam and predicts the alphabet based on the hand gesture made by the user, using a Convolutional Neural Network. There are a total of 24 classes in the dataset.
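The Kaggle dataset ships as CSV files in which each row holds a label followed by 784 pixel values (a flattened 28x28 grayscale image). A minimal sketch of turning those rows into training tensors might look like this; the function name and the exact normalization are assumptions, not taken from the notebook:

```python
import numpy as np

def rows_to_images(rows):
    """Convert Sign Language MNIST rows (label + 784 pixel values)
    into normalized 28x28x1 image tensors and a label array.

    `rows` is a NumPy array of shape (n, 785), as loaded from the
    Kaggle CSV: the first column is the label, the rest are pixels.
    """
    labels = rows[:, 0].astype(np.int64)
    # Pixel values are 0-255; scale to [0, 1] and add a channel axis.
    images = rows[:, 1:].astype(np.float32).reshape(-1, 28, 28, 1) / 255.0
    return images, labels
```

In practice the rows would come from something like `np.loadtxt("sign_mnist_train.csv", delimiter=",", skiprows=1)` before being passed to this helper.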
The user places their hand inside the green box, which is the region of interest, and makes a gesture; the model then predicts the corresponding alphabet.
CONV2D->RELU->MAXPOOLING->CONV2D->RELU->MAXPOOLING->DROPOUT->CONV2D->RELU->MAXPOOLING->DROPOUT->FLATTEN->DENSE->DROPOUT-> DENSE->SOFTMAX
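The layer stack above can be sketched in Keras as follows; the filter counts, dense width, and dropout rates are assumptions for illustration, not the exact values used in training:

```python
from tensorflow.keras import layers, models

def build_model(num_classes=24):
    """Sketch of the CONV->RELU->POOL (x3, with dropout) -> FLATTEN
    -> DENSE -> DROPOUT -> DENSE softmax architecture described above.
    Hyperparameters here are illustrative assumptions."""
    return models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu",
                      input_shape=(28, 28, 1)),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Dropout(0.25),
        layers.Conv2D(128, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Dropout(0.25),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
```

The model would then be compiled with a categorical cross-entropy loss and trained on the preprocessed 28x28x1 images.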
The trained model can be downloaded from this link.
Dataset link: https://www.kaggle.com/datamunge/sign-language-mnist