IT 499: Major Project (Bachelor Thesis)

Leveraging Multimodal Behavioral Analytics for Automated Job Interview Performance Assessment and Feedback

We propose a multimodal analytical framework that analyzes a candidate in an interview scenario and provides feedback on predefined behavioral labels such as engagement, speaking rate, and eye contact. We perform a comprehensive analysis of the interviewee's facial expressions, speech, and prosodic information, using the video, audio, and text transcripts obtained from the recorded interview. These multimodal data sources are combined into a composite representation, which is used to train machine learning classifiers to predict the class labels. The predictions are then used to provide constructive feedback to the interviewee on their behavioral cues and body language.
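The pipeline described above can be sketched as feature-level (early) fusion: per-modality feature vectors are concatenated into one composite representation and fed to a classifier. The following is a minimal illustration with synthetic data; the feature dimensions, label, and choice of logistic regression are illustrative assumptions, not the exact setup used in this project (see the paper and the per-modality READMEs for the actual details).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 40  # number of interview samples (synthetic)

# Hypothetical per-modality feature vectors; dimensions are illustrative.
video_feats = rng.normal(size=(n, 8))   # e.g. facial-expression statistics
audio_feats = rng.normal(size=(n, 6))   # e.g. prosodic features
text_feats = rng.normal(size=(n, 5))    # e.g. transcript-based features

# Composite representation: concatenate modality features (early fusion).
X = np.hstack([video_feats, audio_feats, text_feats])
y = rng.integers(0, 2, size=n)          # e.g. high/low engagement (synthetic)

# Train a simple classifier on the fused representation.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("fused feature shape:", X.shape)
```

Late fusion (training one classifier per modality and combining their predictions) is a common alternative when modalities differ greatly in scale or reliability.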

We use the MIT Interview Dataset for our experiments: https://roc-hci.com/past-projects/automated-prediction-of-job-interview-performances/. More information on how to run the code for each of the modalities can be found in the README.md in the respective folders.

For more details on our experiments, please read our paper: https://www.aclweb.org/anthology/2020.challengehml-1.6.pdf

Contributors: Selvan Sunitha Ravi, Anumeha Agrawal, Rosa Anil George
Project Guide: Dr. Sowmya Kamath, Dr. Anand Kumar