iamwangyabin / ESN
Official implementation of AAAI 2023 paper "Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference".
☆38 · Updated 2 years ago
Alternatives and similar repositories for ESN
Users interested in ESN are comparing it to the repositories listed below
- Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need (IJCV 2024) ☆153 · Updated last year
- Class-Incremental Learning: A Survey (TPAMI 2024) ☆278 · Updated last year
- The official implementation for ECCV22 paper: "FOSTER: Feature Boosting and Compression for Class-Incremental Learning" in PyTorch. ☆58 · Updated 3 years ago
- ICLR 2024 paper on Continual Learning ☆37 · Updated last year
- Code for NeurIPS 2022 paper "S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning" ☆106 · Updated last year
- PyTorch Implementation of Learning to Prompt (L2P) for Continual Learning @ CVPR22 ☆203 · Updated 2 years ago
- The code repository for "Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks" (TPAMI 2023) ☆38 · Updated 2 years ago
- Code for ICLR 2023 paper (Oral): Towards Stable Test-Time Adaptation in Dynamic Wild World ☆202 · Updated 2 years ago
- PyTorch Implementation of DualPrompt: Complementary Prompting for Rehearsal-free Continual Learning @ ECCV22 ☆127 · Updated 3 years ago
- [CVPR 2023] Learning with Fantasy: Semantic-Aware Virtual Contrastive Constraint for Few-Shot Class-Incremental Learning ☆74 · Updated last year
- PyTorch code for the CVPR'23 paper: "CODA-Prompt: COntinual Decomposed Attention-based Prompting for Rehearsal-Free Continual Learning" ☆156 · Updated 2 years ago
- [CVPR 2024] The code repository for "Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning" in PyTorch. ☆81 · Updated 10 months ago
- RanPAC: Random Projections and Pre-trained Models for Continual Learning - official code repository for the NeurIPS 2023 paper ☆57 · Updated 11 months ago
- ☆56 · Updated 2 years ago
- [ICCV 2023] A Unified Continual Learning Framework with General Parameter-Efficient Tuning ☆92 · Updated last year
- [ICLR 2023] The official code for our ICLR 2023 (top 25%) paper: "Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class… ☆93 · Updated 2 years ago
- Code for CVPR2022 paper "Not Just Selection, but Exploration: Online Class-Incremental Continual Learning via Dual View Consistency" ☆27 · Updated 3 years ago
- Code Implementation for CVPR 2023 Paper: Class-Incremental Exemplar Compression for Class-Incremental Learning ☆26 · Updated 2 years ago
- Official repository for Online Class Incremental Learning on Stochastic Blurry Task Boundary via Mask and Visual Prompt Tuning on ICCV 20… ☆27 · Updated last year
- The code repository for "Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration" (NeurIPS'23) in PyTorch ☆69 · Updated 2 years ago
- The code repository for "A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning" (ICLR'23) in PyTorch ☆46 · Updated 2 years ago
- Code for ICML 2022 paper: Efficient Test-Time Model Adaptation without Forgetting ☆135 · Updated 2 years ago
- Online Prototype Learning for Online Continual Learning (ICCV 2023) ☆44 · Updated 2 years ago
- Forward Compatible Few-Shot Class-Incremental Learning (CVPR'22) ☆137 · Updated 2 years ago
- Official implementation of the paper "Masked Autoencoders are Efficient Class Incremental Learners" ☆45 · Updated last year
- PyTorch implementation of our CVPR2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning" ☆119 · Updated 2 years ago
- [CVPR 2024] PriViLege: Pre-trained Vision and Language Transformers Are Few-Shot Incremental Learners ☆55 · Updated last year
- Official repository for "CLIP model is an Efficient Continual Learner". ☆108 · Updated 3 years ago
- Hierarchical Decomposition of Prompt-Based Continual Learning: Rethinking Obscured Sub-optimality (NeurIPS 2023, Spotlight) ☆90 · Updated last year
- Dynamic Token Expansion with Continual Transformers, accepted at CVPR 2022 ☆147 · Updated 3 years ago