AvaMERG / AvaMERG-Pipeline
☆13 · Updated 4 months ago
Alternatives and similar repositories for AvaMERG-Pipeline
Users interested in AvaMERG-Pipeline are comparing it to the repositories listed below.
- Explainable Multimodal Emotion Reasoning (EMER), OV-MER (ICML), and AffectGPT (ICML, Oral) ☆260 · Updated 2 months ago
- The code for the paper "ECR-Chain: Advancing Generative Language Models to Better Emotion Cause Reasoners through Reasoning Chains" (IJCA… ☆12 · Updated last year
- This repository hosts the code, data and model weights of PanoSent. ☆55 · Updated 3 months ago
- ☆13 · Updated last year
- M3ED: Multi-modal Multi-scene Multi-label Emotional Dialogue Database. ACL 2022 ☆115 · Updated 3 years ago
- ☆94 · Updated 4 months ago
- Toolkits for Multimodal Emotion Recognition ☆253 · Updated 5 months ago
- Synth-Empathy: Towards High-Quality Synthetic Empathy Data ☆16 · Updated 7 months ago
- ☆57 · Updated last year
- This is the repository of our ACL 2024 paper "ESCoT: Towards Interpretable Emotional Support Dialogue Systems". ☆32 · Updated 5 months ago
- Emotion-LLaMA: Multimodal Emotion Recognition and Reasoning with Instruction Tuning ☆428 · Updated last week
- The datasets for image emotion computing ☆37 · Updated 3 years ago
- PyTorch implementation of Noise Imitation Based Adversarial Training for Robust Multimodal Sentiment Analysis (Accepted by IEEE… ☆14 · Updated last year
- The official realization of InstructERC ☆144 · Updated 4 months ago
- MIntRec2.0 is the first large-scale dataset for multimodal intent recognition and out-of-scope detection in multi-party conversations (IC… ☆63 · Updated 2 months ago
- ☆15 · Updated last year
- ☆197 · Updated last month
- [ICML'25 Spotlight] Catch Your Emotion: Sharpening Emotion Perception in Multimodal Large Language Models ☆36 · Updated last month
- MIntRec: A New Dataset for Multimodal Intent Recognition (ACM MM 2022) ☆114 · Updated 5 months ago
- This paper presents our winning submission to Subtask 2 of SemEval 2024 Task 3 on multimodal emotion cause analysis in conversations. ☆23 · Updated last year
- EmoLLM: Multimodal Emotional Understanding Meets Large Language Models ☆18 · Updated last year
- ☆18 · Updated 3 months ago
- A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations (ACL 2023) ☆70 · Updated 11 months ago
- A Unimodal Valence-Arousal Driven Contrastive Learning Framework for Multimodal Multi-Label Emotion Recognition (ACM MM 2024 oral) ☆25 · Updated 11 months ago
- [EMNLP 2024] "ESC-Eval: Evaluating Emotion Support Conversations in Large Language Models" ☆20 · Updated last year
- ☆24 · Updated 5 months ago
- HumanOmni ☆196 · Updated 7 months ago
- PyTorch implementation of Tailor: Versatile Multi-modal Learning for Multi-label Emotion Recognition ☆63 · Updated 2 years ago
- ☆21 · Updated 4 months ago
- A comprehensive overview of affective computing research in the era of large language models (LLMs). ☆26 · Updated last year