Project presentation: https://youtu.be/mYM9DRypGQk
"IEMOCAP" is the input data which can be requested from University of Southern California. Due to licensing we cannot share the data.
The reference paper for the IEMOCAP dataset is: C. Busso, M. Bulut, C.-C. Lee, A. Kazemzadeh, E. Mower, S. Kim, J. N. Chang, S. Lee, and S. S. Narayanan, "IEMOCAP: Interactive emotional dyadic motion capture database," Language Resources and Evaluation, vol. 42, no. 4, pp. 335-359, December 2008.
To preprocess the data, set the path to the input data in the DATA_PREPROCESSING.ipynb Jupyter notebook:
# input path
iemocap_full_release_path = "/home/mandeep_stanford/cs231n_project/IEMOCAP_full_release/"
# output path
iemocap_pre_processed_data_path = "/home/mandeep_stanford/cs231n_project/code/pre-processed_data/"
With these paths set, the notebook will run end to end. Keep the resulting directory structure and file names unchanged; the training code depends on them.
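Before running the notebook, it can help to confirm the release was extracted intact. The snippet below is a minimal sanity-check sketch, not part of the repository: it assumes the standard IEMOCAP release layout (Session1 through Session5, each with sentences/wav and dialog/EmoEvaluation subdirectories) and reuses the same variable names as the notebook cell above.

# Sanity-check sketch: verify the expected IEMOCAP layout before running
# DATA_PREPROCESSING.ipynb. The Session1..Session5 / sentences / dialog
# layout below is assumed from the standard IEMOCAP release; adjust if
# your copy differs.
import os

# input path (same variable names as in the notebook)
iemocap_full_release_path = "/home/mandeep_stanford/cs231n_project/IEMOCAP_full_release/"
# output path
iemocap_pre_processed_data_path = "/home/mandeep_stanford/cs231n_project/code/pre-processed_data/"

for session in range(1, 6):
    session_dir = os.path.join(iemocap_full_release_path, f"Session{session}")
    for sub in ("sentences/wav", "dialog/EmoEvaluation"):
        path = os.path.join(session_dir, sub)
        if not os.path.isdir(path):
            print(f"Missing expected directory: {path}")

# Create the output directory so the notebook can write into it.
os.makedirs(iemocap_pre_processed_data_path, exist_ok=True)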
Cite this work as:
M. Singh and Y. Fang, “Emotion recognition in audio and video using deep neural networks,” arXiv, vol. abs/2006.08129, 2020.