lixin2002cn's Stars
rugvedmhatre/Multimodal-Sentiment-Analysis
This project aims to develop a robust multi-modal sentiment analysis system that integrates visual cues from images with textual data to provide a more comprehensive understanding of human emotions.
yyh-rain-song/ReMamber
ECCV24 "ReMamber: Referring Image Segmentation with Mamba Twister" official repository.
leson502/CORECT_EMNLP2023
[EMNLP2023] Conversation Understanding using Relational Temporal Graph Neural Networks with Auxiliary Cross-Modality Interaction
pliang279/awesome-multimodal-ml
Reading list for research topics in multimodal machine learning
tangjyan/zh-cn
Academic homepage, Chinese edition.
caozhong14/cross-modal
Justin1904/Low-rank-Multimodal-Fusion
This is the repository for "Efficient Low-rank Multimodal Fusion with Modality-Specific Factors", Liu, Shen, et al., ACL 2018.
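The core idea behind low-rank fusion is to replace the huge explicit fusion tensor with per-modality rank-r factor matrices, so the fused representation is computed without ever materializing the tensor. A minimal NumPy sketch of that computation (bimodal case only; the toy dimensions, variable names, and random factors are illustrative, not from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)
rank, d_out = 4, 8   # decomposition rank and fused output size (toy values)
dt, da = 5, 3        # toy text/audio embedding dimensions

# One rank-r factor stack per modality; the full (dt+1, da+1, d_out)
# fusion tensor is never built explicitly.
Wt = rng.standard_normal((rank, dt + 1, d_out))
Wa = rng.standard_normal((rank, da + 1, d_out))

def low_rank_fusion(z_text, z_audio):
    """Project each 1-augmented modality vector through its factors,
    multiply element-wise, and sum over the rank dimension -- a CP-style
    rank-r decomposition of the fusion tensor."""
    zt = np.append(z_text, 1.0)          # (dt+1,)
    za = np.append(z_audio, 1.0)         # (da+1,)
    pt = np.einsum('i,rio->ro', zt, Wt)  # (rank, d_out)
    pa = np.einsum('j,rjo->ro', za, Wa)  # (rank, d_out)
    return (pt * pa).sum(axis=0)         # (d_out,)

h = low_rank_fusion(rng.standard_normal(dt), rng.standard_normal(da))
```

Because each modality contributes only `rank * (d_m + 1) * d_out` parameters, cost grows linearly with the number of modalities instead of multiplicatively.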
Justin1904/TensorFusionNetworks
PyTorch implementation of Tensor Fusion Networks for multimodal sentiment analysis.
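Tensor Fusion Networks fuse modalities via the outer product of their 1-augmented embeddings, so the resulting tensor contains unimodal, bimodal, and trimodal interaction terms. A minimal NumPy sketch of that fusion step (function name and toy inputs are my own, not taken from the repository):

```python
import numpy as np

def tensor_fusion(z_text, z_audio, z_video):
    """Fuse three unimodal embeddings via the outer product of their
    1-augmented vectors. Appending a constant 1 to each vector keeps the
    unimodal and bimodal interactions as sub-blocks of the trimodal tensor."""
    zt = np.append(z_text, 1.0)
    za = np.append(z_audio, 1.0)
    zv = np.append(z_video, 1.0)
    # Three-way outer product -> tensor of shape (dt+1, da+1, dv+1)
    fused = np.einsum('i,j,k->ijk', zt, za, zv)
    return fused.ravel()  # flattened fusion representation

# Toy embeddings of dims 3, 2, 2 -> fused vector of length 4 * 3 * 3 = 36
f = tensor_fusion(np.ones(3), np.ones(2), np.ones(2))
```

The flattened tensor would normally feed a downstream prediction network; its size grows multiplicatively with the modality dimensions, which is exactly the cost the low-rank fusion work above addresses.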
thuiar/Self-MM
Codes for paper "Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis"
WasifurRahman/BERT_multimodal_transformer
ys-zong/awesome-self-supervised-multimodal-learning
[T-PAMI] A curated list of self-supervised multimodal learning resources.
vvnzhang/TensorFusionNetwork-CMU_MOSI
An implementation of TFN on CMU-MOSI.
Jie-Xie/CMU-MultimodalDataSDK
MultimodalSDK provides tools to easily apply machine learning algorithms on well-known affective computing datasets such as CMU-MOSI, CMU-MOSEI, POM, and ICT-MMMO.
pakoromilas/MultimodalSDK_loader
Data parser for the CMU-MultimodalSDK package including parsing for CMU-MOSEI, CMU-MOSI, and POM datasets
Ighina/MultiModalSA
MultiModal Sentiment Analysis architectures for CMU-MOSEI.
pliang279/MFN
[AAAI 2018] Memory Fusion Network for Multi-view Sequential Learning
Vincent-ZHQ/DMER
A survey of deep multimodal emotion recognition.
liyunfan1223/multimodal-sentiment-analysis
This repository contains the companion code for multimodal sentiment analysis experiments.
declare-lab/conv-emotion
This repo contains implementation of different architectures for emotion recognition in conversations.
kyegomez/MultiModalMamba
A novel implementation fusing ViT with Mamba into a fast, agile, and high-performance multi-modal model. Powered by Zeta, the simplest AI framework ever.
declare-lab/multimodal-deep-learning
This repository contains various models targeting multimodal representation learning and multimodal fusion for downstream tasks such as multimodal sentiment analysis.
zehuiwu/MMML
Multi-Modality Multi-Loss Fusion Network
krahets/hello-algo
"Hello Algorithms": a data structures and algorithms tutorial with animated illustrations and one-click runnable code. Supports Python, Java, C++, C, C#, JS, Go, Swift, Rust, Ruby, Kotlin, TS, and Dart. Simplified and Traditional Chinese editions are updated in sync; the English version is in progress.
thuiar/AWESOME-MSA
Paper List for Multimodal Sentiment Analysis
thuiar/MMSA
MMSA is a unified framework for Multimodal Sentiment Analysis.
duliaojason/Multimodal-emotion-classification
An experiment in multimodal emotion classification.
declare-lab/MELD
MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversation
Robin-WZQ/multimodal-emotion-recognition-DEMO
A demo of multi-modal emotion recognition.
Tengfei000/MPED
MPED: A Multi-Modal Physiological Emotion Database for Discrete Emotion
Samarth-Tripathi/IEMOCAP-Emotion-Detection
Multi-modal Emotion detection from IEMOCAP on Speech, Text, Motion-Capture Data using Neural Nets.