
Emotion Recognition

A real-time implementation of emotion recognition built with PyTorch and OpenCV. The model is trained on the FER2013 facial expression dataset (see the Usage section below).

Quick Start

Install

  1. Clone the repository:
git clone https://github.com/hash-ir/Emotion-Recognition.git
cd Emotion-Recognition
  2. To run the IPython notebook Emotion Recognition.ipynb, install the tools the project uses: Jupyter, PyTorch, and OpenCV.

An alternative is to create a conda environment from the environment.yaml file included in the repository:

conda env create -f environment.yaml
  3. Once the dependencies are installed, replace the path of haarcascade_frontalface_default.xml in the notebook with the path on your machine. It will look something like:
/home/<username>/anaconda3/lib/python3.x/site-packages/cv2/data/haarcascade_frontalface_default.xml

Usage

  1. For training, download the FER2013 dataset, extract it, and put fer2013.csv in the root directory.
  2. For testing, execute the first cell, the network-architecture cell, and the last two code cells. Real-time testing requires a webcam.

Author(s)

  • Hashir Ahmad - full project - GitHub

License

This work is licensed under the MIT License.