yuntaoshou / Awesome-Emotion-Reasoning
Awesome-Emotion-Reasoning is a collection of Emotion Reasoning works, including papers, code, and datasets.
☆65 · Updated last month
Alternatives and similar repositories for Awesome-Emotion-Reasoning
Users interested in Awesome-Emotion-Reasoning are comparing it to the repositories listed below.
- EmoLLM: Multimodal Emotional Understanding Meets Large Language Models ☆19 · Updated last year
- Released code and data for "Multi-modal Stance Detection: New Datasets and Model" (ACL 2024) ☆27 · Updated last year
- [Findings of NAACL 2024] Emotion-Anchored Contrastive Learning Framework for Emotion Recognition in Conversation ☆37 · Updated last year
- UniSA: Unified Generative Framework for Sentiment Analysis ☆52 · Updated last year
- [ICML'25 Spotlight] Catch Your Emotion: Sharpening Emotion Perception in Multimodal Large Language Models ☆42 · Updated 2 weeks ago
- MIntRec: A New Dataset for Multimodal Intent Recognition (ACM MM 2022) ☆122 · Updated 7 months ago
- A comprehensive overview of affective computing research in the era of large language models (LLMs) ☆27 · Updated last year
- Official implementation of "Towards Multi-Modal Sarcasm Detection via Hierarchical Congruity Modeling with Knowledge Enhancement" ☆41 · Updated last year
- Official repository for the paper "MME-Emotion: A Holistic Evaluation Benchmark for Emotional Intelligence in Multimodal Large Language M…" ☆16 · Updated 3 months ago
- NAACL 2022 paper on Analyzing Modality Robustness in Multimodal Sentiment Analysis ☆31 · Updated 2 years ago
- This repository hosts the code, data, and model weights of PanoSent ☆56 · Updated 5 months ago
- ACM MM '22: Unified Multi-modal Pre-training for Few-shot Sentiment Analysis with Prompt-based Learning ☆18 · Updated 2 years ago
- GPT-4V with Emotion ☆97 · Updated 2 years ago
- A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations (ACL 2023) ☆73 · Updated last year
- A curated list of works related to Misinformation Video Detection, as a companion to an ACM Multimedia 2023 survey ☆125 · Updated 2 months ago
- Official implementation of "Dynamic Routing Transformer Network for Multimodal Sarcasm Detection" (ACL '23) ☆35 · Updated 2 years ago
- [Paperlist] Awesome paper list of multimodal dialog, including methods, datasets, and metrics ☆37 · Updated 10 months ago
- ☆24 · Updated 2 years ago
- ☆23 · Updated 7 months ago
- [ACM MM 2021 Oral] "Exploiting BERT for Multimodal Target Sentiment Classification Through Input Space Translation" ☆40 · Updated 4 years ago
- Code for "Supervised Prototypical Contrastive Learning for Emotion Recognition in Conversation" (EMNLP '22) ☆80 · Updated 2 years ago
- Code and model for AAAI 2024: "UMIE: Unified Multimodal Information Extraction with Instruction Tuning" ☆45 · Updated last year
- [ACM MM 2022 Oral] Official implementation of "SER30K: A Large-Scale Dataset for Sticker Emotion Recognition" ☆29 · Updated 3 years ago
- Repository for the ACL 2023 paper "Supervised Adversarial Contrastive Learning for Emotion Recognition in Conversations", and SemEval 202… ☆36 · Updated 2 years ago
- Generated Multimodal Prompt ☆22 · Updated 2 years ago
- Code and dataset of "MEmoR: A Dataset for Multimodal Emotion Reasoning in Videos" (MM '20) ☆55 · Updated 2 years ago
- ☆23 · Updated 4 years ago
- Implementation of our ACL 2023 paper "Unifying Cross-Lingual and Cross-Modal Modeling Towards Weakly Supervised Multilingual Vision-Langua…" ☆19 · Updated 2 years ago
- This repository contains code to evaluate various multimodal large language models using different instructions across multiple multimoda… ☆30 · Updated 9 months ago
- TCL-MAP is a powerful method for multimodal intent recognition (AAAI 2024) ☆53 · Updated last year