yasar-rehman / FEDVSSL
This is the official implementation of the "FVSSL Algorithm".
☆24 · Updated last year
Alternatives and similar repositories for FEDVSSL
Users interested in FEDVSSL are comparing it to the libraries listed below:
- FedUL: Federated Learning from Only Unlabeled Data with Class-Conditional-Sharing Clients ☆32 · Updated 2 years ago
- ☆27 · Updated 2 years ago
- Code for CVPR2021 paper: MOOD: Multi-level Out-of-distribution Detection ☆38 · Updated last year
- ZSKD with PyTorch ☆31 · Updated 2 years ago
- ☆21 · Updated 4 years ago
- ICLR 2023 - FedFA: Federated Feature Augmentation ☆58 · Updated 2 years ago
- codes for Neural Architecture Ranker and detailed cell information datasets based on NAS-Bench series ☆12 · Updated 3 years ago
- [ICLR2023] Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning (https://arxiv.org/abs/2210.0022… ☆40 · Updated 2 years ago
- ☆22 · Updated 5 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆82 · Updated 3 years ago
- Pytorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation ☆30 · Updated 3 years ago
- A generic code base for neural network pruning, especially for pruning at initialization. ☆31 · Updated 2 years ago
- ☆13 · Updated 3 years ago
- Pytorch implementation of Feature Generation for Long-Tail Classification by Rahul Vigneswaran, Marc T Law, Vineeth N Balasubramaniam and… ☆38 · Updated 2 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated 2 years ago
- ☆27 · Updated 4 years ago
- [ICML'21] Estimate the accuracy of the classifier in various environments through self-supervision ☆27 · Updated 4 years ago
- Understanding Self-Supervised Learning in a non-IID Setting ☆21 · Updated 2 years ago
- IJCAI 2021, "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆42 · Updated 2 years ago
- ☆23 · Updated last year
- Repo for the paper "Extrapolating from a Single Image to a Thousand Classes using Distillation" ☆37 · Updated last year
- ☆57 · Updated 4 years ago
- [ICML2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers ☆23 · Updated last year
- Official PyTorch implementation of “Flexible Dataset Distillation: Learn Labels Instead of Images” ☆42 · Updated 4 years ago
- TF-FD ☆20 · Updated 2 years ago
- [ECCV2022] The PyTorch implementation of paper "Equivariance and Invariance Inductive Bias for Learning from Insufficient Data" ☆19 · Updated 2 years ago
- [NeurIPS 2022] Make Sharpness-Aware Minimization Stronger: A Sparsified Perturbation Approach -- Official Implementation ☆45 · Updated 2 years ago
- PyTorch code for the ICCV'21 paper: "Always Be Dreaming: A New Approach for Class-Incremental Learning" ☆64 · Updated 2 years ago
- [NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data ☆45 · Updated 2 years ago
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) ☆113 · Updated last year