SignLSTM

Real-Time Sign Language Detection using LSTM

This project implements a real-time sign language detection system built with an LSTM model in Keras and keypoints extracted with MediaPipe Holistic. A deep learning model classifies American Sign Language (ASL) gestures in real time from video input. The model is trained on a dataset of ASL gesture sequences and achieves high classification accuracy.
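As a sketch of how per-frame MediaPipe Holistic results can be flattened into the feature vector an LSTM consumes (the landmark counts follow MediaPipe's documented layout; the function name `extract_keypoints` is illustrative, not necessarily what `main.ipynb` uses):

```python
import numpy as np

# MediaPipe Holistic landmark counts: 33 pose points (x, y, z, visibility),
# 468 face points and 21 points per hand (x, y, z each) -> 1662 values total.
POSE, FACE, HAND = 33, 468, 21

def extract_keypoints(results):
    """Flatten one frame's Holistic results into a fixed-length vector,
    zero-filling any landmark set that was not detected."""
    pose = (np.array([[lm.x, lm.y, lm.z, lm.visibility]
                      for lm in results.pose_landmarks.landmark]).flatten()
            if results.pose_landmarks else np.zeros(POSE * 4))
    face = (np.array([[lm.x, lm.y, lm.z]
                      for lm in results.face_landmarks.landmark]).flatten()
            if results.face_landmarks else np.zeros(FACE * 3))
    lh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.left_hand_landmarks.landmark]).flatten()
          if results.left_hand_landmarks else np.zeros(HAND * 3))
    rh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.right_hand_landmarks.landmark]).flatten()
          if results.right_hand_landmarks else np.zeros(HAND * 3))
    return np.concatenate([pose, face, lh, rh])  # 1662 values per frame
```

Zero-filling missing landmark sets keeps every frame the same length, which the LSTM's fixed input shape requires.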

Usage

  1. Run the cells in main.ipynb.
  2. Hold your hand in front of the camera and form an ASL gesture.
  3. The program will display the detected gesture and its classification.
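At a high level, the real-time step keeps a rolling window of recent keypoint frames and runs the model once the window is full. A minimal sketch of that loop with a stand-in model (the 30-frame window, the 1662-feature frames, the example gesture labels, and `predict_gesture` are assumptions; the real model lives in `main.ipynb`):

```python
from collections import deque

import numpy as np

ACTIONS = ["hello", "thanks", "iloveyou"]  # example gesture labels (assumed)
WINDOW = 30            # frames per sequence fed to the LSTM (assumed)
NUM_FEATURES = 1662    # flattened Holistic keypoints per frame

def predict_gesture(sequence):
    """Stand-in for model.predict: one probability per gesture."""
    return np.ones(len(ACTIONS)) / len(ACTIONS)   # placeholder: uniform

frames = deque(maxlen=WINDOW)   # rolling window of the most recent frames
detections = []
for step in range(45):          # pretend we read 45 video frames
    keypoints = np.zeros(NUM_FEATURES)   # would come from Holistic here
    frames.append(keypoints)
    if len(frames) == WINDOW:   # only predict once the window is full
        probs = predict_gesture(np.array(frames))
        detections.append(ACTIONS[int(np.argmax(probs))])
```

The `deque(maxlen=WINDOW)` discards the oldest frame automatically, so after warm-up the model sees a fresh 30-frame sequence on every new frame.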

Notes

  • If you don't want to capture training frames yourself, you can use the dataset provided under MP_Data.
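Loading the bundled data amounts to walking the MP_Data folder and stacking the saved keypoint arrays. A sketch assuming a layout of MP_Data/<gesture>/<sequence>/<frame>.npy with one keypoint array per frame (the exact folder layout and array shapes are assumptions about this repo's dataset):

```python
import os

import numpy as np

def load_sequences(data_dir):
    """Build (X, y) from a keypoint dataset: one .npy array per frame,
    grouped into per-sequence stacks, labelled by gesture folder name."""
    gestures = sorted(os.listdir(data_dir))
    X, y = [], []
    for label, gesture in enumerate(gestures):
        gesture_dir = os.path.join(data_dir, gesture)
        for seq in sorted(os.listdir(gesture_dir)):
            seq_dir = os.path.join(gesture_dir, seq)
            frames = [np.load(os.path.join(seq_dir, f))
                      for f in sorted(os.listdir(seq_dir))]
            X.append(np.stack(frames))   # shape: (frames, features)
            y.append(label)
    return np.array(X), np.array(y)
```

The resulting X has shape (sequences, frames, features), which matches the (timesteps, features) input an LSTM layer expects per sample.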

P.S.

Special thanks to my brother for recording the GIF :)