This repository contains a Python script for real-time face detection, emotion analysis, and interest determination using OpenCV and DeepFace. The script captures video from a webcam, detects faces, analyzes their emotions, and displays whether the person is interested based on their emotional state and confidence level.
To run the script, you need to install the following libraries:
- OpenCV
- DeepFace
- NumPy
You can install these dependencies using pip:
pip install opencv-python deepface numpy
The script captures video frames from the default webcam, detects faces using the Haar Cascade classifier, and analyzes the emotions of the detected faces using the DeepFace library. It also determines if the person is interested based on the dominant emotion and its confidence level.
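For reference, the following is a minimal sketch of that capture-and-analyze loop, assuming the default webcam (index 0) and OpenCV's bundled Haar Cascade file; the variable names and exact DeepFace return handling are illustrative and may differ from the actual script.

```python
import cv2
from deepface import DeepFace

# Load OpenCV's bundled frontal-face Haar Cascade and open the default webcam.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture(0)

while True:
    ret, frame = cap.read()
    if not ret:
        break

    # Haar Cascade detection works on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        face_roi = frame[y:y + h, x:x + w]
        # enforce_detection=False avoids exceptions when DeepFace cannot
        # re-detect a face inside the cropped region.
        result = DeepFace.analyze(face_roi, actions=["emotion"],
                                  enforce_detection=False)
        # Newer DeepFace versions return a list of results, older ones a dict.
        analysis = result[0] if isinstance(result, list) else result
        dominant = analysis["dominant_emotion"]
        confidence = analysis["emotion"][dominant]

        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, f"{dominant} ({confidence:.0f}%)", (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

    cv2.imshow("Interest Analyzer", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Running DeepFace on every frame can be slow; a practical refinement (not necessarily present in the script) is to analyze only every few frames and reuse the last result in between.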
The script uses a predefined set of colors to highlight different emotions (a sample mapping is sketched after this list):
- Angry: Red
- Disgust: Yellow
- Fear: Blue
- Happy: Green
- Sad: Cyan
- Surprise: Magenta
- Neutral: White
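One possible way to express this mapping is a dictionary of OpenCV BGR tuples (OpenCV uses BGR order, not RGB); the dictionary name and exact values below are assumptions, not copied from the script.

```python
# Emotion labels as produced by DeepFace, mapped to BGR colors.
EMOTION_COLORS = {
    "angry":    (0, 0, 255),      # red
    "disgust":  (0, 255, 255),    # yellow
    "fear":     (255, 0, 0),      # blue
    "happy":    (0, 255, 0),      # green
    "sad":      (255, 255, 0),    # cyan
    "surprise": (255, 0, 255),    # magenta
    "neutral":  (255, 255, 255),  # white
}
```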
The script considers a person interested if their dominant emotion is 'happy', 'surprise', or 'neutral' with a confidence level above 70%. Otherwise, it considers the person not interested.
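That rule could be written as a small helper like the one below; the function name and structure are hypothetical and shown only to make the threshold explicit.

```python
INTERESTED_EMOTIONS = {"happy", "surprise", "neutral"}

def is_interested(dominant_emotion: str, confidence: float) -> bool:
    """Return True if the dominant emotion suggests interest with confidence above 70%."""
    return dominant_emotion in INTERESTED_EMOTIONS and confidence > 70
```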
- Clone the repository:
  git clone https://github.com/codewithdark-git/InterestAnalyzer.git
  cd InterestAnalyzer
- Run the script:
  python emotion_detection.py
- Press 'q' to exit the video feed.
- emotion_detection.py: The main script for emotion detection and interest analysis.
- README.md: This documentation file.
This project can be used for various applications, such as:
- Real-time emotion monitoring
- Enhancing user experience by adapting to emotions
- Interest detection in video calls or presentations
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome! Please open an issue or submit a pull request for any changes or improvements.
This project was inspired by a memorable classroom experience. In a lively database class led by a strict but perceptive teacher, lessons were a familiar mix of firm discipline and spontaneous humor.
One day, the teacher remarked on their ability to read our emotions from our expressions and behaviors. This sparked an idea: what if we could create a system that uses machine learning to read students' faces and assess their interest levels in real-time? Such a system could help identify who is engaged and who isn't, making it a valuable tool for educators.
This project aims to bridge technology and education, enhancing the learning experience by providing real-time feedback on student engagement. This tool can help teachers tailor their methods to better suit the needs of their students, creating a more responsive and effective learning environment.