lizaijing / UniSA
UniSA: Unified Generative Framework for Sentiment Analysis
☆54 · Updated last year
Alternatives and similar repositories for UniSA
Users interested in UniSA are comparing it to the libraries listed below:
- A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations (ACL 2023) ☆69 · Updated 10 months ago
- ☆198 · Updated 2 weeks ago
- MIntRec: A New Dataset for Multimodal Intent Recognition (ACM MM 2022) ☆107 · Updated 4 months ago
- M-SENA: All-in-One Platform for Multimodal Sentiment Analysis ☆93 · Updated 3 years ago
- MultiEMO: An Attention-Based Correlation-Aware Multimodal Fusion Framework for Emotion Recognition in Conversations (ACL 2023) ☆79 · Updated last year
- Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis (ALMT) ☆125 · Updated 5 months ago
- ☆24 · Updated 2 years ago
- PyTorch implementation for Tailor Versatile Multi-modal Learning for Multi-label Emotion Recognition ☆62 · Updated 2 years ago
- Paper List for Multimodal Sentiment Analysis ☆101 · Updated 4 years ago
- ☆78 · Updated 9 months ago
- This repository contains the official implementation code of the paper Improving Multimodal Fusion with Hierarchical Mutual Information M… ☆189 · Updated 2 years ago
- A tool for extracting multimodal features from videos. ☆183 · Updated 2 years ago
- Make Acoustic and Visual Cues Matter: CH-SIMS v2.0 Dataset and AV-Mixup Consistent Module ☆86 · Updated 3 years ago
- [Findings of NAACL 2024] Emotion-Anchored Contrastive Learning Framework for Emotion Recognition in Conversation ☆35 · Updated 9 months ago
- The source code for the paper titled "Sentiment Knowledge Enhanced Attention Fusion Network (SKEAFN)". ☆28 · Updated 2 years ago
- Source code for ICASSP 2022 paper "MM-DFN: Multimodal Dynamic Fusion Network For Emotion Recognition in Conversations". ☆89 · Updated 2 years ago
- This repository contains the implementation of the paper -- Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment An… ☆71 · Updated 2 years ago
- TCL-MAP is a powerful method for multimodal intent recognition (AAAI 2024) ☆48 · Updated last year
- ☆37 · Updated 3 years ago
- M3ED: Multi-modal Multi-scene Multi-label Emotional Dialogue Database (ACL 2022) ☆114 · Updated 2 years ago
- ☆25 · Updated last year
- ☆80 · Updated 3 years ago
- A Unimodal Valence-Arousal Driven Contrastive Learning Framework for Multimodal Multi-Label Emotion Recognition (ACM MM 2024 oral) ☆25 · Updated 10 months ago
- Multimodal Emotion Recognition in Conversation Challenge (CCAC 2023) ☆37 · Updated last year
- ACM MM '22: Unified Multi-modal Pre-training for Few-shot Sentiment Analysis with Prompt-based Learning ☆18 · Updated 2 years ago
- ☆22 · Updated 5 months ago
- [TMM2022] Source codes of CENet ☆36 · Updated 2 years ago
- The official implementation of InstructERC ☆142 · Updated 3 months ago
- ☆18 · Updated last year
- Codes for AoM: Detecting Aspect-oriented Information for Multimodal Aspect-Based Sentiment Analysis ☆45 · Updated 2 years ago