Code and pretrained models for paper: Data-Free Adversarial Distillation
☆99 · Nov 28, 2022 · Updated 3 years ago
Alternatives and similar repositories for Data-Free-Adversarial-Distillation
Users interested in Data-Free-Adversarial-Distillation are comparing it to the libraries listed below.
- Accompanying code for the paper "Zero-shot Knowledge Transfer via Adversarial Belief Matching" ☆143 · Apr 29, 2020 · Updated 5 years ago
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆74 · Apr 7, 2022 · Updated 3 years ago
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation ☆76 · Oct 24, 2022 · Updated 3 years ago
- Official PyTorch implementation of Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion (CVPR 2020) ☆516 · Jan 25, 2023 · Updated 3 years ago
- Knowledge Amalgamation Engine ☆99 · Feb 28, 2024 · Updated 2 years ago
- ZSKD with PyTorch ☆31 · Jun 26, 2023 · Updated 2 years ago
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) ☆49 · Jun 20, 2019 · Updated 6 years ago
- MaskRCNN with Knowledge Distillation ☆21 · Nov 6, 2020 · Updated 5 years ago
- ☆23 · Dec 31, 2020 · Updated 5 years ago
- Efficient computing methods developed by Huawei Noah's Ark Lab ☆1,306 · Nov 5, 2024 · Updated last year
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆18 · Mar 21, 2023 · Updated 3 years ago
- Knowledge distillation (KD) from a decision-based black-box (DB3) teacher without training data. ☆22 · May 3, 2022 · Updated 3 years ago
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Apr 16, 2022 · Updated 3 years ago
- A PyTorch implementation of "Data-Free Learning of Student Networks" (ICCV 2019). ☆18 · Oct 8, 2019 · Updated 6 years ago
- ☆14 · Oct 6, 2023 · Updated 2 years ago
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation". ☆35 · Jul 25, 2024 · Updated last year
- This repository contains the first model I tried on a cloud dataset from the Sentinel satellite for cloud segmentation. The model was UNE… ☆13 · Jun 1, 2022 · Updated 3 years ago
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆414 · May 17, 2021 · Updated 4 years ago
- PyTorch implementation of AmalgamateGNN (CVPR'21) ☆21 · Jul 29, 2022 · Updated 3 years ago
- Knowledge Extraction with No Observable Data (NeurIPS 2019) ☆46 · Jan 9, 2020 · Updated 6 years ago
- Implementation of several knowledge distillation techniques in PyTorch ☆15 · Feb 25, 2019 · Updated 7 years ago
- Companion code to the preprint: E Bıyık, K Wang, N Anari, D Sadigh, "Batch Active Learning using Determinantal Point Processes". arXiv pr… ☆14 · Jul 25, 2024 · Updated last year
- An interactive demo based on Segment-Anything for stroke-based painting, enabling human-like painting. ☆35 · Apr 16, 2023 · Updated 2 years ago
- Data-Free Knowledge Distillation ☆22 · May 30, 2022 · Updated 3 years ago
- [NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data ☆45 · Oct 27, 2022 · Updated 3 years ago
- [AAAI-2020] Official implementation of "Online Knowledge Distillation with Diverse Peers". ☆76 · Jul 6, 2023 · Updated 2 years ago
- Official PyTorch implementation of "Cross-Domain Ensemble Distillation for Domain Generalization" (ECCV 2022) ☆25 · Dec 11, 2024 · Updated last year
- ☆13 · Aug 23, 2018 · Updated 7 years ago
- Data-Free Network Quantization with Adversarial Knowledge Distillation (PyTorch) ☆30 · Sep 1, 2021 · Updated 4 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,426 · Oct 16, 2023 · Updated 2 years ago
- Black-Box Ripper: Copying black-box models using generative evolutionary algorithms (NeurIPS 2020) · official implementation ☆29 · Oct 25, 2020 · Updated 5 years ago
- [ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers ☆23 · Jul 7, 2024 · Updated last year
- [AAAI 2024] Data-Free Hard-Label Robustness Stealing Attack ☆15 · Mar 29, 2024 · Updated last year
- Unofficial PyTorch Lightning implementation of Contrastive Syn-to-Real Generalization (ICLR 2021) ☆17 · Aug 11, 2021 · Updated 4 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,745 · Nov 25, 2021 · Updated 4 years ago
- 2nd-place solution of the ECCV 2020 workshop VIPriors Image Classification Challenge, https://arxiv.org/abs/2008.00261 ☆13 · Aug 22, 2021 · Updated 4 years ago
- EMNLP 2022: Analyzing and Evaluating Faithfulness in Dialogue Summarization ☆13 · Mar 20, 2025 · Updated last year
- Code for "Revisiting Batch Norm Initialization". ☆12 · Jul 14, 2022 · Updated 3 years ago
- YOLOv3 on the TensorFlow Core API and edge deployment using Intel's Neural Compute Stick series ☆12 · Nov 29, 2019 · Updated 6 years ago