valence-arousal
There are 17 repositories under the valence-arousal topic.
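For context, the valence-arousal (circumplex) model that these repositories share represents an emotion as a point on a two-dimensional plane: valence measures pleasantness and arousal measures intensity. A minimal sketch of this idea, assuming coordinates normalized to [-1, 1] (the helper name and quadrant labels are illustrative, not taken from any listed repository):

```python
# Hypothetical helper illustrating the valence-arousal plane:
# valence = pleasantness (-1 unpleasant .. 1 pleasant),
# arousal = intensity  (-1 low energy .. 1 high energy).

def va_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point to a coarse emotion quadrant."""
    if valence >= 0 and arousal >= 0:
        return "happy/excited"   # pleasant, high energy
    if valence < 0 and arousal >= 0:
        return "angry/afraid"    # unpleasant, high energy
    if valence < 0:
        return "sad/bored"       # unpleasant, low energy
    return "calm/content"        # pleasant, low energy
```

For example, `va_quadrant(0.8, 0.6)` returns `"happy/excited"`. Many of the projects below regress continuous valence and arousal values rather than predicting discrete quadrants, but the plane is the same.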
face-analysis/emonet
Official implementation of the paper "Estimation of continuous valence and arousal levels from faces in naturalistic conditions", Antoine Toisoul, Jean Kossaifi, Adrian Bulat, Georgios Tzimiropoulos and Maja Pantic, Nature Machine Intelligence, 2021
serkansulun/midi-emotion
Generates multi-instrument symbolic music (MIDI) based on user-provided emotions from the valence-arousal plane.
kdhht2334/ELIM_FER
[NeurIPS 2022] The official repository of Expression Learning with Identity Matching for Facial Expression Recognition
ValentinRicher/emotion-recognition-GAN
A semi-supervised approach to detecting emotions on in-the-wild faces using a GAN.
kdhht2334/AVCE_FER
[ECCV2022] The official repository of Emotion-aware Multi-view Contrastive Learning for Facial Emotion Recognition
edukhnai/valence-arousal-recognition
Emotion recognition with the Keras library. Uses the AffectNet dataset with valence-arousal labels and implements a CNN architecture with regression.
sid230798/Facial-emotion-Recognition
This repo contains code for transfer learning for facial emotion detection via valence and arousal levels. We use pretrained VGG-16 weights and apply a deep neural network and an LSTM model to those features in PyTorch. We tested our model on the Aff-Wild dataset.
kdhht2334/Contrastive-Adversarial-Learning-FER
[AAAI2021] A repository of Contrastive Adversarial Learning for Person-independent FER
extnike/face_emotion_recognition
Project to detect one of 9 pre-trained emotions in a given photo, video, or webcam stream.
vishaal27/LeVAsa
Code for "It’s LeVAsa not LevioSA! Latent Encodings for Valence-Arousal Structure Alignment" [CoDS-CoMAD'21]
kdhht2334/Hidden_Emotion_Detection_using_MM_Signals
[CHI2021] Hidden emotion detection using multi-modal signals
Khoality-dev/The-Expression
An Emotional Game for Social Expression Learning
lugrenl/Emotion-Recognition_model
Experiments with convolutional neural networks for emotion recognition.
EltonCN/Emotion-estimation-applied-to-storytelling-reaction
Experiment exploring how emotion analysis can be used to gauge a person's emotional reaction to an audiovisual work.
ryselgh/in-mca_au-va_ard-arima
Searches for a dependency between "annemo" annotations of valence and arousal from RECOLA and Action Units from OpenFace, using ARD regression and ARIMA.
saeu5407/hsemotion-onnx
An ONNX implementation of hsemotion.
NUSTM/UniVA
Code for paper "A Unimodal Valence-Arousal Driven Contrastive Learning Framework for Multimodal Multi-Label Emotion Recognition"