alibaba-mmai-research / CLIP-FSAR
Code for our IJCV 2023 paper "CLIP-guided Prototype Modulating for Few-shot Action Recognition".
☆62 · Updated last year
Alternatives and similar repositories for CLIP-FSAR:
Users interested in CLIP-FSAR are comparing it to the repositories listed below.
- Code for our CVPR 2023 paper "MoLo: Motion-augmented Long-short Contrastive Learning for Few-shot Action Recognition". ☆43 · Updated last year
- The official repository for the ICLR 2024 paper "FROSTER: Frozen CLIP is a Strong Teacher for Open-Vocabulary Action Recognition". ☆73 · Updated 2 months ago
- [ICCV'23] Official PyTorch implementation for the paper "Exploring Predicate Visual Context in Detecting Human-Object Interactions". ☆74 · Updated 8 months ago
- Video Test-Time Adaptation for Action Recognition (CVPR 2023). ☆41 · Updated 5 months ago
- Official PyTorch code for the CVPR 2024 paper 'Part-aware Unified Representation of Language and Skeleton for Zero-shot Action Recognitio…'. ☆30 · Updated 5 months ago
- Code for our CVPR 2022 paper "Hybrid Relation Guided Set Matching for Few-shot Action Recognition". ☆25 · Updated 2 years ago
- Code for our paper "HyRSM++: Hybrid Relation Guided Temporal Set Matching for Few-shot Action Recognition". ☆13 · Updated 2 years ago
- Official code of the ACM MM 2024 paper "Unseen No More: Unlocking the Potential of CLIP for Generative Zero-shot HOI Detection". ☆19 · Updated 7 months ago
- [ECCV 2024] The official repo for "SA-DVAE: Improving Zero-Shot Skeleton-Based Action Recognition by Disentangled Variational Autoencoder…"