A project that detects a person's face and predicts their emotion using OpenCV and deep learning.
This project uses the FER-2013 dataset, which consists of 48x48 pixel grayscale images of faces. The faces have been automatically registered so that each face is more or less centred and occupies about the same amount of space in each image.
Each image is labelled with one of seven facial expressions (Angry, Disgust, Fear, Happy, Sad, Surprise, Neutral). The training set consists of 28,709 examples and the public test set consists of 3,589 examples.
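Before a 48x48 grayscale face can be fed to a classifier, it is typically scaled to [0, 1] and reshaped with batch and channel axes. A minimal sketch of that preprocessing is below; the `EMOTIONS` label order and the `preprocess_face` helper are illustrative assumptions, not necessarily the exact names or ordering used in this repository.

```python
import numpy as np

# Hypothetical index -> label mapping for the 7 FER-2013 classes;
# the actual ordering used by the project may differ.
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def preprocess_face(gray_img):
    """Turn a 48x48 grayscale face (uint8, 0-255) into a normalized
    (1, 48, 48, 1) float32 tensor for a Keras-style CNN."""
    img = np.asarray(gray_img, dtype=np.float32)
    assert img.shape == (48, 48), "FER-2013 faces are 48x48"
    img = img / 255.0                 # scale pixel values to [0, 1]
    return img.reshape(1, 48, 48, 1)  # add batch and channel axes

# Example with a synthetic mid-gray face
face = np.full((48, 48), 128, dtype=np.uint8)
x = preprocess_face(face)
print(x.shape)  # (1, 48, 48, 1)
```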
The dataset can be accessed by clicking here.
- Clone the project to your local machine
git clone git@github.com:archihalder/EmoDet.git
- Enter the directory
cd EmoDet
- Install the required packages
pip install -r requirements.txt
- Enter src directory
cd src
- Run the file
python3 video.py
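A script like `video.py` typically captures webcam frames, detects faces with an OpenCV Haar cascade, and overlays the predicted emotion. The sketch below shows that loop under stated assumptions: the `model` is assumed to be a Keras-style classifier returning 7 probabilities, and `best_emotion`, `run_webcam`, and the `EMOTIONS` ordering are hypothetical names for illustration, not the repository's actual code.

```python
import numpy as np

# Assumed label order for the 7 FER-2013 classes (may differ in the repo)
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def best_emotion(probs):
    """Map a length-7 probability vector to its emotion label."""
    return EMOTIONS[int(np.argmax(np.asarray(probs)))]

def run_webcam(model):
    """Capture frames, detect faces with a Haar cascade, and draw the
    predicted emotion. `model` is assumed to expose Keras-style predict()."""
    import cv2  # imported here so the helpers above work without OpenCV
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            # Crop the face, resize to 48x48, scale to [0, 1]
            roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
            probs = model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, best_emotion(probs), (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
        cv2.imshow("EmoDet", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()
```

Pressing `q` in the display window stops the loop and releases the camera.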