NeuralCollapseApplications / FSCIL
[ICLR 2023] The official code for our ICLR 2023 (notable, top 25%) paper: "Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class-Incremental Learning"
☆90 · Updated last year
Alternatives and similar repositories for FSCIL:
- [CVPR 2023] Learning with Fantasy: Semantic-Aware Virtual Contrastive Constraint for Few-Shot Class-Incremental Learning ☆70 · Updated last year
- The code repository for "Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks" (TPAMI 2023) ☆36 · Updated last year
- Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need (IJCV 2024) ☆137 · Updated 8 months ago
- RanPAC: Random Projections and Pre-trained Models for Continual Learning (official code repository for the NeurIPS 2023 paper) ☆47 · Updated 2 months ago
- The code repository for "Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration" (NeurIPS 2023) in PyTorch ☆54 · Updated last year
- The code repository for "Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning" (CVPR 2024) in PyTorch ☆70 · Updated last month
- ☆12 · Updated 2 years ago
- ☆31 · Updated last year
- [CVPR 2023] AttriCLIP: A Non-Incremental Learner for Incremental Knowledge Learning ☆15 · Updated last year
- Official code for "Generating Instance-level Prompts for Rehearsal-free Continual Learning" (ICCV 2023) ☆42 · Updated last year
- ☆49 · Updated 2 years ago
- Official repository for "CLIP model is an Efficient Continual Learner" ☆95 · Updated 2 years ago
- Code for the NeurIPS 2022 paper "S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning" ☆98 · Updated 7 months ago
- ☆21 · Updated 4 months ago
- Official implementation of the paper "Masked Autoencoders are Efficient Class Incremental Learners" ☆42 · Updated 11 months ago
- The official implementation for the ECCV 2022 paper "FOSTER: Feature Boosting and Compression for Class-Incremental Learning" in PyTorch