Lum1104 / MER-Factory
Pre-process, annotate, evaluate, and train your Affective Computing (e.g., Multimodal Emotion Recognition, Sentiment Analysis) datasets ALL within MER-Factory! (LangGraph-based agent workflow)
⭐68 · Updated 2 weeks ago
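The tagline above describes MER-Factory's stages (pre-process, annotate, evaluate) as a LangGraph agent workflow. As a rough illustration of that pattern only, here is a minimal LangGraph sketch; the node names, state fields, and placeholder logic are assumptions for illustration and are not MER-Factory's actual graph:

```python
# Hypothetical sketch of a LangGraph pipeline (not MER-Factory's real graph).
# Assumes `langgraph` is installed: pip install langgraph
from typing import TypedDict

from langgraph.graph import StateGraph, END


class PipelineState(TypedDict):
    sample: str       # ID of a raw multimodal sample (assumed field)
    annotation: str   # emotion label produced by the annotate step
    score: float      # quality score produced by the evaluate step


def preprocess(state: PipelineState) -> dict:
    # Placeholder: extract frames/audio, normalize text, etc.
    return {"sample": state["sample"].strip()}


def annotate(state: PipelineState) -> dict:
    # Placeholder: call an LLM or classifier to label the sample.
    return {"annotation": "neutral"}


def evaluate(state: PipelineState) -> dict:
    # Placeholder: score the annotation for downstream filtering.
    return {"score": 1.0}


# Wire the stages into a linear graph and run it once.
graph = StateGraph(PipelineState)
graph.add_node("preprocess", preprocess)
graph.add_node("annotate", annotate)
graph.add_node("evaluate", evaluate)
graph.set_entry_point("preprocess")
graph.add_edge("preprocess", "annotate")
graph.add_edge("annotate", "evaluate")
graph.add_edge("evaluate", END)

app = graph.compile()
print(app.invoke({"sample": " video_0001.mp4 ", "annotation": "", "score": 0.0}))
```

Each node returns a partial state update, which LangGraph merges into the shared state; a real pipeline of this kind would typically add conditional edges for retries or human review.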
Alternatives and similar repositories for MER-Factory
Users interested in MER-Factory are comparing it to the libraries listed below.
- Explainable Multimodal Emotion Reasoning (EMER), OV-MER (ICML), and AffectGPT (ICML, Oral) ⭐302 · Updated 4 months ago
- ⭐26 · Updated 7 months ago
- Toolkits for Multimodal Emotion Recognition ⭐266 · Updated 6 months ago
- GPT-4V with Emotion ⭐97 · Updated 2 years ago
- [CVPR 2023] Code for "Learning Emotion Representations from Verbal and Nonverbal Communication" ⭐54 · Updated 9 months ago
- [CVPR 2024] EmoVIT: Revolutionizing Emotion Insights with Visual Instruction Tuning ⭐38 · Updated 7 months ago
- ⭐16 · Updated 5 months ago
- This is the official implementation of the ICCV 2023 paper "EmoSet: A large-scale visual emotion dataset with rich attributes". ⭐60 · Updated last year
- MIntRec2.0 is the first large-scale dataset for multimodal intent recognition and out-of-scope detection in multi-party conversations (IC… ⭐69 · Updated 3 months ago
- This paper presents our winning submission to Subtask 2 of SemEval 2024 Task 3 on multimodal emotion cause analysis in conversations. ⭐24 · Updated last year
- ⭐23 · Updated last year
- MIntRec: A New Dataset for Multimodal Intent Recognition (ACM MM 2022) ⭐122 · Updated 7 months ago
- ⭐21 · Updated 10 months ago
- Emotion-LLaMA: Multimodal Emotion Recognition and Reasoning with Instruction Tuning ⭐476 · Updated 3 weeks ago
- This repository provides the code for MMA-DFER, a multimodal (audiovisual) emotion recognition method. This is an official implementation … ⭐48 · Updated last year
- EmoLLM: Multimodal Emotional Understanding Meets Large Language Models ⭐19 · Updated last year
- ⭐24 · Updated 2 years ago
- TCL-MAP is a powerful method for multimodal intent recognition (AAAI 2024) ⭐53 · Updated last year
- ⭐20 · Updated last year
- A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations (ACL 2023) ⭐73 · Updated last year
- GCNet, official PyTorch implementation of our paper "GCNet: Graph Completion Network for Incomplete Multimodal Learning in Conversation" ⭐92 · Updated 7 months ago
- av-SALMONN: Speech-Enhanced Audio-Visual Large Language Models ⭐13 · Updated last year
- A Unimodal Valence-Arousal Driven Contrastive Learning Framework for Multimodal Multi-Label Emotion Recognition (ACM MM 2024 Oral) ⭐25 · Updated last year
- [Information Fusion 2024] HiCMAE: Hierarchical Contrastive Masked Autoencoder for Self-Supervised Audio-Visual Emotion Recognition ⭐117 · Updated 3 months ago
- [ACM ICMR'25] Official repository for "eMotions: A Large-Scale Dataset for Emotion Recognition in Short Videos" ⭐36 · Updated 4 months ago
- PyTorch implementation of "Noise Imitation Based Adversarial Training for Robust Multimodal Sentiment Analysis" (Accepted by IEEE… ⭐14 · Updated last year
- ⭐23 · Updated 7 months ago
- An official implementation of "Incomplete Multimodality-Diffused Emotion Recognition" in PyTorch (NeurIPS 2023) ⭐59 · Updated 2 years ago
- The repo for "On-the-fly Modulation for Balanced Multimodal Learning" (T-PAMI 2024) ⭐18 · Updated last year
- [EMNLP 2023] Conversation Understanding using Relational Temporal Graph Neural Networks with Auxiliary Cross-Modality Interaction ⭐60 · Updated last year