IntelLabs / continuallearning
☆26 · Updated 11 months ago
Alternatives and similar repositories for continuallearning
Users interested in continuallearning are comparing it to the libraries listed below.
- ☆72 · Updated 2 years ago
- Code to reproduce the experiments of "Rethinking Experience Replay: a Bag of Tricks for Continual Learning" ☆46 · Updated 2 years ago
- Official Code Repository for "La-MAML: Look-Ahead Meta-Learning for Continual Learning" ☆76 · Updated 4 years ago
- [NeurIPS 2020, Spotlight] Improved Schemes for Episodic Memory-based Lifelong Learning ☆18 · Updated 4 years ago
- PyTorch implementation for "The Surprising Positive Knowledge Transfer in Continual 3D Object Shape Reconstruction" ☆33 · Updated 2 years ago
- This repository demonstrates the application of our proposed task-free continual learning method on a synthetic experiment. ☆13 · Updated 5 years ago
- IIRC: Incremental Implicitly Refined Classification ☆30 · Updated 2 years ago
- Towards increasing stability of neural networks for continual learning: https://arxiv.org/abs/2006.06958.pdf (NeurIPS'20) ☆75 · Updated 2 years ago
- Code for the CVPR 2023 paper "Real-Time Evaluation in Online Continual Learning: A New Hope" ☆18 · Updated 9 months ago
- Code for the publication "Distilled Replay: Overcoming Forgetting through Synthetic Examples" ☆13 · Updated 4 years ago
- Codebase for "Online Continual Learning with Maximally Interfered Retrieval" ☆101 · Updated 2 years ago
- Code for "Supermasks in Superposition" ☆124 · Updated last year
- Model Zoos for Continual Learning (ICLR 22) ☆45 · Updated 2 years ago
- Codebase for "Online Fast Adaptation and Knowledge Accumulation: a New Approach to Continual Learning". This is a ServiceNow Research pro… ☆105 · Updated 2 years ago
- Growing Dual-Memory Self-Organizing Networks ☆25 · Updated 6 years ago
- Code for the CVPR 2021 paper "MOOD: Multi-level Out-of-distribution Detection" ☆38 · Updated last year
- Official PyTorch implementation of "Flexible Dataset Distillation: Learn Labels Instead of Images" ☆42 · Updated 4 years ago
- Official repo for "Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks". Accepted at NeurIPS 2020.