drmuskangarg / Multimodal-datasets
This repository is built in association with our position paper "Multimodality for NLP-Centered Applications: Resources, Advances and Frontiers". As part of this release, we share information about recent multimodal datasets that are available for research purposes. We found that although 100+ multimodal language resources are availab…
☆308 · Updated 3 years ago
Alternatives and similar repositories for Multimodal-datasets
Users interested in Multimodal-datasets are comparing it to the repositories listed below.
- Hate-CLIPper: Multimodal Hateful Meme Classification with Explicit Cross-modal Interaction of CLIP features - Accepted at EMNLP 2022 Work… ☆54 · Updated 5 months ago
- This repository provides a comprehensive collection of research papers focused on multimodal representation learning, all of which have b… ☆81 · Updated 3 months ago
- EMNLP 2023 Papers: Explore cutting-edge research from EMNLP 2023, the premier conference for advancing empirical methods in natural langu… ☆109 · Updated last year
- [NeurIPS 2021] Multiscale Benchmarks for Multimodal Representation Learning ☆578 · Updated last year
- A Survey on multimodal learning research. ☆331 · Updated 2 years ago
- [ICLR 2023] MultiViz: Towards Visualizing and Understanding Multimodal Models ☆98 · Updated last year
- [Paperlist] Awesome paper list of multimodal dialog, including methods, datasets and metrics ☆37 · Updated 8 months ago
- MIntRec: A New Dataset for Multimodal Intent Recognition (ACM MM 2022) ☆112 · Updated 5 months ago
- This repository presents the UR-FUNNY dataset: the first dataset for multimodal humor detection ☆146 · Updated 4 years ago
- ☆37 · Updated last year
- Multimodal Sarcasm Detection Dataset ☆356 · Updated last year
- Dataset and Code for Multimodal Fact Checking and Explanation Generation (Mocheg) ☆58 · Updated last year
- Recent Advances in Vision and Language Pre-training (VLP) ☆294 · Updated 2 years ago
- [ACL 2023] Code and dataset for the paper "MMSD2.0: Towards a Reliable Multi-modal Sarcasm Detection System" ☆45 · Updated last year
- Must-read Papers on Large Language Model (LLM) Continual Learning ☆145 · Updated last year
- Extensive acceptance rates and information on main AI conferences ☆165 · Updated last year
- [T-PAMI] A curated list of self-supervised multimodal learning resources. ☆263 · Updated last year
- Research Trends in LLM-guided Multimodal Learning. ☆355 · Updated last year
- [MIR-2023-Survey] A continuously updated paper list for multi-modal pre-trained big models ☆289 · Updated 2 months ago
- Official implementation of Dynamic Routing Transformer Network for Multimodal Sarcasm Detection (ACL'23) ☆34 · Updated 2 years ago
- ☆23 · Updated 3 years ago
- Explainable Multimodal Emotion Reasoning (EMER), OV-MER (ICML), and AffectGPT (ICML, Oral) ☆259 · Updated 2 months ago
- ☆210 · Updated 3 years ago
- The repository collects a variety of multi-modal transformer architectures, including image transformer, video transformer, image-languag… ☆229 · Updated 3 years ago
- Resources (conference/journal publications, references to datasets) for harmful memes detection. ☆48 · Updated 3 years ago
- Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning ☆197 · Updated last year
- ☆177 · Updated last year
- This is the official implementation of the paper "MM-SHAP: A Performance-agnostic Metric for Measuring Multimodal Contributions in Vision… ☆31 · Updated last year
- Official implementation of Towards Multi-Modal Sarcasm Detection via Hierarchical Congruity Modeling with Knowledge Enhancement. ☆40 · Updated last year
- Paper list about multimodal and large language models, only used to record papers I read in the daily arXiv for personal needs. ☆738 · Updated this week