DerekGrant / Semi-Supervised-Lifelong-Learning-for-Sentiment-Classification
Lifelong machine learning is a machine learning paradigm that continually learns tasks and accumulates knowledge for reuse. Its knowledge extraction and reuse abilities let a lifelong learner apply what it has learned from one task to solve related problems. In sentiment classification, …
☆10 · Updated 5 years ago
Alternatives and similar repositories for Semi-Supervised-Lifelong-Learning-for-Sentiment-Classification
Users interested in Semi-Supervised-Lifelong-Learning-for-Sentiment-Classification are comparing it with the repositories listed below.
- ☆10 · Updated 4 years ago
- Codebase for the paper "A Gradient Flow Framework for Analyzing Network Pruning" ☆21 · Updated 4 years ago
- ☆21 · Updated 5 years ago
- ☆19 · Updated 5 years ago
- [ICLR 2020] Dynamic Sparse Training: Find Efficient Sparse Network From Scratch With Trainable Masked Layers. ☆31 · Updated 5 years ago
- Official implementation of the ECCV 2022 paper "Class-Incremental Learning with Cross-Space Clustering and Controlled Transfer" ☆17 · Updated 2 years ago
- ☆9 · Updated 2 years ago
- Energy-Based Models for Continual Learning, official repository (PyTorch) ☆42 · Updated 2 years ago
- ☆26 · Updated 4 years ago
- Official code repository for "La-MAML: Look-Ahead Meta-Learning for Continual Learning" ☆76 · Updated 4 years ago
- Continual learning using variational prototype replays ☆9 · Updated 4 years ago
- Paper and code for "Curriculum Learning by Optimizing Learning Dynamics" (AISTATS 2021) ☆19 · Updated 3 years ago
- Implementation code for "Uncertainty-based Continual Learning with Adaptive Regularization" (NeurIPS 2019) ☆35 · Updated 4 years ago
- [ECML PKDD 2020] "Topological Insights into Sparse Neural Networks" ☆13 · Updated 3 years ago
- The original code for the paper "Benchmarks for Continual Few-Shot Learning". ☆35 · Updated 4 years ago
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated 2 years ago
- ☆13 · Updated 3 years ago
- Q. Yao, H. Yang, B. Han, G. Niu, J. Kwok. "Searching to Exploit Memorization Effect in Learning from Noisy Labels." ICML 2020 ☆22 · Updated 4 years ago
- Good Subnetworks Provably Exist: Pruning via Greedy Forward Selection ☆21 · Updated 4 years ago
- The original code for the data providers and the datasets of the paper "Defining Benchmarks for Continual Few-Shot Learning". ☆16 · Updated 5 years ago
- ☆22 · Updated last year
- This repository demonstrates the application of our proposed task-free continual learning method on a synthetic experiment. ☆13 · Updated 5 years ago
- Source code of the ICLR 2020 paper "Towards Fast Adaptation of Neural Architectures with Meta Learning" ☆27 · Updated 2 years ago
- Learning To Stop While Learning To Predict ☆34 · Updated 2 years ago
- Code for the paper "Selecting Relevant Features from a Universal Representation for Few-shot Classification" ☆41 · Updated 4 years ago
- ☆10 · Updated 5 years ago
- Released code for the paper "ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding" ☆31 · Updated 4 years ago
- PyTorch code for the NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples" ☆20 · Updated 4 years ago
- ☆42 · Updated 4 years ago
- Codebase for the paper "Continual learning with local module selection" ☆25 · Updated 3 years ago