Code and pretrained models for the paper "Data-Free Adversarial Distillation"
☆100 · Updated Nov 28, 2022
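For context, data-free adversarial distillation trains a student without real data via a min-max game: a generator synthesizes inputs that maximize the teacher-student output discrepancy, while the student minimizes it on those same inputs. A minimal sketch of that alternating loop (the toy MLPs, L1 discrepancy, and hyperparameters here are illustrative assumptions, not the paper's actual architecture or loss details):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins: a frozen "teacher", a smaller "student", and a generator (assumptions).
teacher = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4)).eval()
student = nn.Sequential(nn.Linear(8, 4))
generator = nn.Sequential(nn.Linear(4, 8), nn.Tanh())

opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)

def discrepancy(x):
    # L1 distance between teacher and student outputs; teacher is treated as fixed.
    with torch.no_grad():
        t = teacher(x)
    return (t - student(x)).abs().mean()

for step in range(50):
    z = torch.randn(32, 4)
    # Student step: minimize the discrepancy on generated (detached) samples.
    x = generator(z).detach()
    loss_s = discrepancy(x)
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()
    # Generator step: maximize the discrepancy (minimize its negative).
    loss_g = -discrepancy(generator(z))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```

Only the generator's optimizer steps on the adversarial loss, so the student is unaffected by the maximization step even though gradients flow through it.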
Alternatives and similar repositories for Data-Free-Adversarial-Distillation
Users interested in Data-Free-Adversarial-Distillation are comparing it to the repositories listed below.
- Accompanying code for the paper "Zero-shot Knowledge Transfer via Adversarial Belief Matching" ☆144 · Updated Apr 29, 2020
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆73 · Updated Apr 7, 2022
- Code for "Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint" ☆21 · Updated Oct 23, 2023
- Official PyTorch implementation of "Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion" (CVPR 2020) ☆516 · Updated Jan 25, 2023
- Official repository for the Data-Free Model Extraction paper (CVPR 2021). https://arxiv.org/abs/2011.14779 ☆76 · Updated Apr 1, 2024
- Knowledge Amalgamation Engine ☆99 · Updated Feb 28, 2024
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆18 · Updated Mar 21, 2023
- Efficient computing methods developed by Huawei Noah's Ark Lab ☆1,306 · Updated Nov 5, 2024
- ZSKD with PyTorch ☆31 · Updated Jun 26, 2023
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) ☆49 · Updated Jun 20, 2019
- Official implementation of the paper "Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay" (AAAI-2… ☆18 · Updated May 5, 2022
- A PyTorch implementation of "Data-Free Learning of Student Networks" (ICCV 2019) ☆18 · Updated Oct 8, 2019
- Official PyTorch implementation of "Cross-Domain Ensemble Distillation for Domain Generalization" (ECCV 2022) ☆25 · Updated Dec 11, 2024
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021) ☆2,654 · Updated May 30, 2023
- Knowledge distillation (KD) from a decision-based black-box (DB3) teacher without training data ☆22 · Updated May 3, 2022
- ICME 2022 Special Session "Beyond Accuracy: Responsible, Responsive, and Robust Multimedia Retrieval" ☆12 · Updated Jun 3, 2024
- 2nd-place solution of the ECCV 2020 workshop VIPriors Image Classification Challenge, https://arxiv.org/abs/2008.00261 ☆13 · Updated Aug 22, 2021
- Code release for "BoxVIS: Video Instance Segmentation with Box Annotation" ☆12 · Updated Dec 22, 2023
- Knowledge Extraction with No Observable Data (NeurIPS 2019) ☆46 · Updated Jan 9, 2020
- ☆13 · Updated Aug 23, 2018
- ☆14 · Updated Oct 6, 2023
- Companion code to the preprint: E Bıyık, K Wang, N Anari, D Sadigh, "Batch Active Learning using Determinantal Point Processes". arXiv pr… ☆14 · Updated Jul 25, 2024
- Implementations of several knowledge distillation techniques in PyTorch ☆15 · Updated Feb 25, 2019
- ☆13 · Updated Aug 25, 2023
- Universal Character Recognizer (UCR): simple, intuitive, extensible, multi-lingual OCR engine ☆15 · Updated Apr 23, 2021
- Official PyTorch implementation of "Relational Knowledge Distillation" (CVPR 2019) ☆413 · Updated May 17, 2021
- Code for "Everybody Prune Now: Structured Pruning of LLMs with only Forward Passes" ☆30 · Updated Mar 28, 2024
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated Dec 15, 2022
- Unofficial PyTorch Lightning implementation of "Contrastive Syn-to-Real Generalization" (ICLR 2021) ☆17 · Updated Aug 11, 2021
- arXiv's ML papers network graph and browser ☆80 · Updated Jan 26, 2019
- Unofficial implementation of the paper "Adversarial Training for Free" ☆23 · Updated May 8, 2019
- Private Adaptive Optimization with Side Information (ICML 2022) ☆16 · Updated Jun 23, 2022
- [NeurIPS 2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data ☆45 · Updated Oct 27, 2022
- Code for the ICML 2021 paper exploring the relationship between adversarial transferability and knowledge transferability ☆17 · Updated Dec 8, 2022
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,425 · Updated Oct 16, 2023
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆107 · Updated Nov 28, 2020
- Computer vision web application built to predict the age, race, and gender of all individuals present in an image. Trained using PyTorch… ☆19 · Updated Dec 8, 2022
- Continual Model Generalization for Unseen Domains ☆45 · Updated Mar 17, 2023
- Compressing Representations for Self-Supervised Learning ☆80 · Updated Feb 18, 2021