Code and pretrained models for the paper "Data-Free Adversarial Distillation"
☆99 · Updated Nov 28, 2022
Alternatives and similar repositories for Data-Free-Adversarial-Distillation
Users interested in Data-Free-Adversarial-Distillation are comparing it to the libraries listed below.
- Accompanying code for the paper "Zero-shot Knowledge Transfer via Adversarial Belief Matching" ☆143 · Updated Apr 29, 2020
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆75 · Updated Apr 7, 2022
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation ☆76 · Updated Oct 24, 2022
- Official PyTorch implementation of Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion (CVPR 2020) ☆517 · Updated Jan 25, 2023
- Knowledge Amalgamation Engine ☆100 · Updated Feb 28, 2024
- ZSKD with PyTorch ☆31 · Updated Jun 26, 2023
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) ☆48 · Updated Jun 20, 2019
- MaskRCNN with Knowledge Distillation ☆21 · Updated Nov 6, 2020
- Efficient computing methods developed by Huawei Noah's Ark Lab ☆1,303 · Updated Nov 5, 2024
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆18 · Updated Mar 21, 2023
- Knowledge distillation (KD) from a decision-based black-box (DB3) teacher without training data ☆22 · Updated May 3, 2022
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Updated Apr 16, 2022
- A PyTorch implementation of "Data-Free Learning of Student Networks" (ICCV 2019) ☆18 · Updated Oct 8, 2019
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation" ☆35 · Updated Jul 25, 2024
- This repository contains the first model I tried on a cloud dataset from a Sentinel satellite for cloud segmentation. The model was UNE… ☆13 · Updated Jun 1, 2022
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆416 · Updated May 17, 2021
- Knowledge Extraction with No Observable Data (NeurIPS 2019) ☆46 · Updated Jan 9, 2020
- Implementation of several knowledge distillation techniques in PyTorch ☆15 · Updated Feb 25, 2019
- Companion code to the preprint: E. Bıyık, K. Wang, N. Anari, D. Sadigh, "Batch Active Learning using Determinantal Point Processes". arXiv pr… ☆15 · Updated Jul 25, 2024
- Data-Free Knowledge Distillation ☆22 · Updated May 30, 2022
- [NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data ☆45 · Updated Oct 27, 2022
- Official source code of "Exploring Effective Data for Surrogate Training Towards Black-box Attack" and "STDatav2: Accessing Efficient Bla… ☆21 · Updated Apr 16, 2025
- Official PyTorch implementation of "Cross-Domain Ensemble Distillation for Domain Generalization" (ECCV 2022) ☆25 · Updated Dec 11, 2024
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated Dec 15, 2022
- Code release for "BoxVIS: Video Instance Segmentation with Box Annotation" ☆12 · Updated Dec 22, 2023
- ☆13 · Updated Aug 23, 2018
- Data-Free Network Quantization with Adversarial Knowledge Distillation (PyTorch) ☆30 · Updated Sep 1, 2021
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,427 · Updated Oct 16, 2023
- (CVPR 2020 oral) DEPARA: Deep Attribution Graph for Deep Knowledge Transferability ☆41 · Updated Mar 16, 2020
- [ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers ☆23 · Updated Jul 7, 2024
- Universal Character Recognizer (UCR): Simple, Intuitive, Extensible, Multi-Lingual OCR engine ☆15 · Updated Apr 23, 2021
- [AAAI 2024] Data-Free Hard-Label Robustness Stealing Attack ☆15 · Updated Mar 29, 2024
- Implementation of the pretext-invariant representation learning algorithm in PyTorch ☆11 · Updated May 27, 2020
- PyTorch implementation of various knowledge distillation (KD) methods ☆1,745 · Updated Nov 25, 2021
- 2nd place solution for the ECCV 2020 workshop VIPriors Image Classification Challenge, https://arxiv.org/abs/2008.00261 ☆13 · Updated Aug 22, 2021
- Code for "Revisiting Batch Norm Initialization" ☆12 · Updated Jul 14, 2022
- ☆23 · Updated Dec 23, 2022
- ☆24 · Updated Apr 29, 2022