ZhangLeUestc / PersEmoN
PersEmoN: A Deep Network for Joint Analysis of Apparent Personality, Emotion and Their Relationship
☆12 · Updated 5 years ago
Alternatives and similar repositories for PersEmoN:
Users that are interested in PersEmoN are comparing it to the libraries listed below
- ☆17 · Updated 2 years ago
- We developed an interpretable CNN for Big Five personality traits using human speech data. This project discovers the different frequency… ☆14 · Updated 2 months ago
- Multimodal classification solution for the SIGIR eCom data challenge using co-attention and transformer language models ☆19 · Updated 4 years ago
- Benchmark for personality trait prediction with neural networks ☆54 · Updated 6 months ago
- [AAAI 2021] A repository of Contrastive Adversarial Learning for Person-independent FER ☆16 · Updated 3 years ago
- End-to-end multimodal emotion and gender recognition with dynamic weights of joint loss ☆10 · Updated 6 years ago
- Human personality trait recognition from short introduction videos of YouTube users (ECCV 2016 challenge) ☆32 · Updated 9 months ago
- ☆27 · Updated 3 years ago
- 🏆 The 2nd-place submission to the CVPR 2021 Evoked Emotion from Videos challenge. ☆17 · Updated 3 years ago
- Philo: uniting modalities ☆24 · Updated last month
- ☆48 · Updated 6 years ago
- Deep Impression: Audiovisual Deep Residual Networks for Multimodal Apparent Personality Trait Recognition ☆42 · Updated 8 years ago
- Code and documentation used in the ChaLearn First Impressions Analysis challenge (ECCV 2016) ☆29 · Updated 2 years ago
- ☆16 · Updated 4 years ago
- Multi-modal Multi-label Emotion Recognition with Heterogeneous Hierarchical Message Passing ☆16 · Updated 2 years ago
- [ICLR 2019] Learning Factorized Multimodal Representations ☆67 · Updated 4 years ago
- Released code and data for "Frame-Transformer Emotion Classification Network" (ICMR 2017) ☆18 · Updated 7 years ago
- Code for selecting an action based on multimodal inputs; in this case the inputs are voice and text. ☆73 · Updated 3 years ago
- Generalized cross-modal NNs; new audiovisual benchmark (IEEE TNNLS 2019) ☆26 · Updated 5 years ago
- An emotion extraction system for images that extracts the emotion a viewer will feel when looking at the image, representing them in … ☆33 · Updated 7 years ago
- Code and dataset of "MEmoR: A Dataset for Multimodal Emotion Reasoning in Videos" (MM '20) ☆53 · Updated last year
- Code for the KDD 2018 paper "Multimodal Sentiment Analysis to Explore the Structure of Emotions" ☆52 · Updated 6 years ago
- MIMAMO Net: Integrating Micro- and Macro-motion for Video Emotion Recognition ☆60 · Updated 4 years ago
- ☆64 · Updated 5 years ago
- Official implementation of the paper "MSAF: Multimodal Split Attention Fusion" ☆81 · Updated 3 years ago
- Code for the NAACL 2021 paper "MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences" ☆42 · Updated 2 years ago
- 🔆 📝 A reading list focused on Multimodal Emotion Recognition (MER) 👂👄 👀 💬 ☆121 · Updated 4 years ago
- ☆31 · Updated 3 years ago
- The code repository for the NAACL 2021 paper "Multimodal End-to-End Sparse Model for Emotion Recognition" ☆102 · Updated 2 years ago
- BB-SVM model for automatic personality detection on the Essays dataset (labeled with Big Five personality traits) ☆32 · Updated 3 years ago