Below you will find information on setup, reference links, and creator notes.
This project is a companion to the report on Facial Emotion Recognition that I wrote for the A.I. class at SGH. The web application lets the user assess the performance of several machine learning models either on preloaded images or on a live video stream from a web camera.
All models have been trained from scratch on the FER+ dataset.
Follow these steps to get a local copy up and running.
- Make sure that you have Anaconda installed.
- In order to use OpenCV's pretrained `CascadeClassifier` as one of the detectors, download `haarcascade_frontalface_default.xml` from OpenCV's GitHub page.
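As a quick sanity check after downloading (a sketch only, not part of the project), note that haarcascade files are plain XML documents whose root element is `<opencv_storage>`, so a malformed or truncated download can be caught with the standard library before wiring it into the app:

```python
import xml.etree.ElementTree as ET


def looks_like_cascade(path):
    """Heuristic check that a file resembles an OpenCV haarcascade:
    these files are XML documents with an <opencv_storage> root element."""
    try:
        root = ET.parse(path).getroot()
    except (ET.ParseError, OSError):
        # Unreadable or not valid XML at all.
        return False
    return root.tag == "opencv_storage"
```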
- Clone the repository.

  ```shell
  git clone https://github.com/ppawlo97/si-summer-2020.git
  ```
- Create a separate virtual environment for the project.

  ```shell
  conda create --name=si_project python=3.7
  ```
- Switch to the created environment.

  ```shell
  conda activate si_project
  ```
- Install the dependencies.

  ```shell
  pip install -r requirements.txt
  ```
- Check out the `master` branch, if you are not already on it.

  ```shell
  git checkout master
  ```
- Remember to always switch to the right virtual environment.

  ```shell
  conda activate si_project
  ```
- Export the path to the pretrained `CascadeClassifier` as an environment variable.

  ```shell
  export PRETRAINED_CASCLAS=/absolute/path/to/haarcascade_frontalface_default.xml
  ```
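For illustration, application code can read this variable with `os.environ` and fail early with a clear message when it is missing (a minimal sketch; the function name is hypothetical and the real application code may do this differently):

```python
import os


def cascade_path():
    """Return the cascade path from PRETRAINED_CASCLAS, or raise a clear error."""
    path = os.environ.get("PRETRAINED_CASCLAS")
    if not path:
        raise RuntimeError(
            "PRETRAINED_CASCLAS is not set; export the absolute path to "
            "haarcascade_frontalface_default.xml first."
        )
    return path
```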
- Point Flask at the main application file via the `FLASK_APP` environment variable.

  ```shell
  export FLASK_APP=fer_app.py
  ```
- Run the application from the root directory on localhost.

  ```shell
  flask run
  ```

  By default the app is served at `http://127.0.0.1:5000`; pass `--host` and `--port` to `flask run` to change this.
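For context, `flask run` imports the module named in `FLASK_APP` and serves the Flask application object it defines. A minimal skeleton of such a module looks like the following (a sketch only; the real `fer_app.py` additionally wires up the trained models and the camera stream):

```python
from flask import Flask

# `flask run` discovers this `app` object in the module named by FLASK_APP.
app = Flask(__name__)


@app.route("/")
def index():
    # The real application renders the FER demo page here.
    return "Facial Emotion Recognition demo"
```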
Distributed under the MIT License. See `LICENSE` for more information.