SignSpeakAI is a gesture recognition project that recognizes hand gestures and interprets them as letters of the American Sign Language (ASL) alphabet. Leveraging a Convolutional Neural Network (CNN), the application aims to bridge communication gaps for people who use sign language.
- CNN Model: Uses a Convolutional Neural Network for accurate gesture recognition (a sketch of a possible architecture follows this list) 🧠
- Training and Evaluation: The model is trained on a sign language dataset and evaluated for accuracy 📈
- Visualization: Confusion matrix and classification report provide insights into the model's performance 📊
- User Interaction: Enables users to interact with the system through gestures 🤖
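As a rough illustration, here is a minimal Keras sketch of the kind of CNN such a project might use. The input shape (28×28 grayscale images), the default of 24 static letter classes (J and Z involve motion), and all layer sizes are assumptions for illustration, not the repository's actual architecture:

```python
from tensorflow.keras import layers, models

def build_model(num_classes=24):
    """Small CNN for classifying static ASL letter images (illustrative)."""
    model = models.Sequential([
        layers.Input(shape=(28, 28, 1)),            # assumed 28x28 grayscale input
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),                        # light regularization
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```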
- Python
- TensorFlow
- Pandas
- Matplotlib
- Seaborn
- Scikit-learn
- Clone the repository: `git clone https://github.com/charvijain12/SignSpeakAI.git`
- Install dependencies: `pip install -r requirements.txt`
- Load the ASL gesture dataset.
- Train the CNN model using the provided script (a loading and training sketch follows this list).
- Evaluate the model's performance (an evaluation sketch also follows).
- Interact with the system using hand gestures.
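A minimal loading and training sketch, assuming the dataset is stored as CSV files with one label column plus 784 flattened pixel values per row (the Sign-Language-MNIST layout). The file names `sign_mnist_train.csv` and `sign_mnist_test.csv` are hypothetical; the repository's own script may load data differently:

```python
import pandas as pd

# Hypothetical file names and CSV layout (label column + flattened pixels).
train = pd.read_csv("sign_mnist_train.csv")
test = pd.read_csv("sign_mnist_test.csv")

# Reshape flat pixel rows into 28x28x1 images and scale to [0, 1].
X_train = train.drop(columns="label").values.reshape(-1, 28, 28, 1) / 255.0
y_train = train["label"].values
X_test = test.drop(columns="label").values.reshape(-1, 28, 28, 1) / 255.0
y_test = test["label"].values

# build_model comes from the architecture sketch above; deriving the class
# count from the labels keeps the output layer consistent with the data.
model = build_model(num_classes=int(y_train.max()) + 1)
model.fit(X_train, y_train, epochs=10, batch_size=64, validation_split=0.1)
```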
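To produce the confusion matrix and classification report mentioned in the features, a sketch using scikit-learn, Seaborn, and Matplotlib (continuing from the training sketch above, so `model`, `X_test`, and `y_test` are the same assumptions):

```python
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import classification_report, confusion_matrix

# Predicted class = index of the highest softmax probability.
y_pred = np.argmax(model.predict(X_test), axis=1)

print(classification_report(y_test, y_pred))  # per-class precision/recall/F1

cm = confusion_matrix(y_test, y_pred)
sns.heatmap(cm, cmap="Blues")                 # rows: true labels, cols: predicted
plt.xlabel("Predicted label")
plt.ylabel("True label")
plt.title("Confusion matrix")
plt.show()
```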
The CNN model achieves 94% accuracy on the test set, providing reliable recognition of American Sign Language gestures.
Contributions are welcome! Feel free to enhance the project, add new features, or improve existing ones. Follow the guidelines in the Contributing file.
This project is licensed under the MIT License.