CMU-MultiComp-Lab / CMU-MultimodalSDK
☆ 162, updated 9 months ago
Related projects:
- Data parser for the CMU-MultimodalSDK package, including parsing for the CMU-MOSEI, CMU-MOSI, and POM datasets (☆ 24, updated last month)
- Multi-Modality Multi-Loss Fusion Network (☆ 39, updated last month)
- Multimodal Sentiment Analysis architectures for CMU-MOSEI (☆ 35, updated last year)
- MISA: Modality-Invariant and -Specific Representations for Multimodal Sentiment Analysis (☆ 187, updated last year)
- Efficient Multimodal Transformer with Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis (TAC 2023) (☆ 44, updated 10 months ago)
- Make Acoustic and Visual Cues Matter: CH-SIMS v2.0 Dataset and AV-Mixup Consistent Module (☆ 53, updated 2 years ago)
- A tool for extracting multimodal features from videos (☆ 131, updated last year)
- An official PyTorch implementation of "Decoupled Multimodal Distilling for Emotion Recognition" (CVPR 2023 highlight) (☆ 90, updated last year)
- Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis (☆ 56, updated 6 months ago)
- The code repository for the NAACL 2021 paper "Multimodal End-to-End Sparse Model for Emotion Recognition" (☆ 94, updated last year)
- [TMM 2022] Source code of CENet (☆ 20, updated last year)
- M-SENA: An All-in-One Platform for Multimodal Sentiment Analysis (☆ 78, updated 2 years ago)
- Implementation of the paper "Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment An…" (☆ 64, updated last year)
- Official implementation of the paper "Transformer-based Feature Reconstruction Network for Robust Multim…" (☆ 24, updated last year)
- Source code for the ICASSP 2022 paper "MM-DFN: Multimodal Dynamic Fusion Network for Emotion Recognition in Conversations" (☆ 76, updated last year)
- PyTorch implementation of "Tailor Versatile Multi-modal Learning for Multi-label Emotion Recognition" (☆ 53, updated last year)
- Toolkits for Multimodal Emotion Recognition (☆ 150, updated 3 months ago)
- Code for the IEEE Access (2020) paper "Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion" (☆ 109, updated 2 years ago)
- MultiEMO: An Attention-Based Correlation-Aware Multimodal Fusion Framework for Emotion Recognition in Conversations (ACL 2023) (☆ 43, updated 10 months ago)
- Code for the paper "Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis" (☆ 181, updated 2 years ago)
- Code for the paper "A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations" (☆ 55, updated 5 months ago)
- Implementation of CubeMLP (☆ 40, updated last year)
- Code for "MMLatch: Bottom-up Top-down Fusion for Multimodal Sentiment Analysis", https://arxiv.org/abs/2201.09828 (to be presented in ICASSP…) (☆ 29, updated last year)