
FeelTube

A simple app that classifies a user's emotion from facial landmarks! This was my very first hands-on experiment with computer vision (CV).

Running the project

  • The project currently has three runnable functionalities.

  • To run the Emotion Classifier on the JAFFE dataset:

cd /path/to/your/directory
python emotrainer.py
  • The program then shows a set of 5 test images chosen at random.

  • Closing that window shows the histogram of LBP (Local Binary Pattern) features; a minimal sketch of this pipeline follows this list.

  • Closing that shows the output, with red text marking wrong predictions and green text marking correct ones.

  • The terminal also reports the mean accuracy.

  • Occasionally the output contains errors.
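
For reference, here is a minimal sketch of an LBP-histogram emotion classifier like the one described above. It is not the repo's actual emotrainer.py: the dataset folder name, the file-name-based labels, and the LinearSVC classifier are assumptions.

# Minimal LBP-histogram emotion classifier sketch (not the repo's emotrainer.py).
# Assumes grayscale face images named like "<emotion>_<n>.png" in a dataset folder.
import os, random
import numpy as np
from skimage import io
from skimage.feature import local_binary_pattern
from sklearn.svm import LinearSVC

def lbp_histogram(gray, points=24, radius=8):
    # Uniform LBP codes summarised as a normalised histogram feature vector.
    lbp = local_binary_pattern(gray, points, radius, method="uniform")
    hist, _ = np.histogram(lbp.ravel(), bins=points + 2, range=(0, points + 2))
    hist = hist.astype(float)
    return hist / (hist.sum() + 1e-7)

def load_dataset(data_dir):
    X, y = [], []
    for name in os.listdir(data_dir):
        gray = io.imread(os.path.join(data_dir, name), as_gray=True)
        X.append(lbp_histogram(gray))
        y.append(name.split("_")[0])          # emotion label taken from the file name
    return np.array(X), np.array(y)

X, y = load_dataset("jaffe/")                 # hypothetical dataset folder
idx = list(range(len(y)))
random.shuffle(idx)
test, train = idx[:5], idx[5:]                # 5 random images held out as the test set

clf = LinearSVC().fit(X[train], y[train])
preds = clf.predict(X[test])
print("mean accuracy:", (preds == y[test]).mean())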

  • To run the web-scraping application that maps an emotion to a movie genre (see the sketch after the command below):

cd /path/to/your/directory
python emovie.py
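
For a rough idea of how an emotion-to-genre lookup could work, here is a hedged sketch. It is not the repo's emovie.py: the emotion-to-genre mapping, the URL, and the HTML selector are all assumptions.

# Sketch of an emotion-to-genre movie lookup (not the repo's emovie.py).
# The mapping, the URL, and the CSS selector below are assumptions.
import requests
from bs4 import BeautifulSoup

EMOTION_TO_GENRE = {
    "happy": "comedy",
    "sad": "drama",
    "angry": "action",
    "surprise": "thriller",
    "neutral": "documentary",
}

def movies_for_emotion(emotion):
    genre = EMOTION_TO_GENRE.get(emotion, "comedy")
    # Hypothetical listing page; a real scraper would target a specific site's markup.
    url = f"https://example.com/movies/genre/{genre}"
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Assume each movie title sits in an <h3 class="title"> element.
    return [h3.get_text(strip=True) for h3 in soup.select("h3.title")][:10]

print(movies_for_emotion("happy"))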
  • To run the face and facial-features detector, which uses the 68-point facial landmark method (a minimal sketch follows the command):
cd /path/to/your/directory
python facial_landmarks.py --shape-predictor shape_predictor_68_face_landmarks.dat --image images/<imagename>.jpg
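
For orientation, here is a minimal sketch of 68-point landmark detection with dlib and OpenCV. It is a simplified approximation, not necessarily identical to the repo's facial_landmarks.py.

# Minimal 68-point facial landmark sketch using dlib + OpenCV
# (simplified; not necessarily identical to facial_landmarks.py).
import argparse
import cv2
import dlib

ap = argparse.ArgumentParser()
ap.add_argument("--shape-predictor", required=True, help="path to shape_predictor_68_face_landmarks.dat")
ap.add_argument("--image", required=True, help="path to the input image")
args = ap.parse_args()

detector = dlib.get_frontal_face_detector()           # HOG-based face detector
predictor = dlib.shape_predictor(args.shape_predictor)

image = cv2.imread(args.image)
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

for rect in detector(gray, 1):                         # detect faces, then fit 68 landmarks
    shape = predictor(gray, rect)
    for i in range(68):
        pt = shape.part(i)
        cv2.circle(image, (pt.x, pt.y), 2, (0, 255, 0), -1)

cv2.imshow("landmarks", image)
cv2.waitKey(0)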