☆27 · Updated Feb 6, 2021
Alternatives and similar repositories for AE-KD
Users interested in AE-KD are comparing it to the libraries listed below.
- Implementation for the ICASSP 2022 paper "Confidence-Aware Multi-Teacher Knowledge Distillation" ☆63 · Updated Feb 12, 2022
- Graph Knowledge Distillation ☆13 · Updated Mar 6, 2020
- [AAAI 2020] Official implementation for "Online Knowledge Distillation with Diverse Peers" ☆76 · Updated Jul 6, 2023
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated Dec 21, 2022
- AAAI 2022 accepted paper, NaturalInversion: Data-Free Image Synthesis Improving Real-World Consistency ☆11 · Updated Mar 11, 2022
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2021: "Network Pruning using Adaptive Exemplar Filters" ☆24 · Updated Apr 4, 2021
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation ☆66 · Updated Mar 9, 2021
- Incredible acceleration with pruning and other compression techniques ☆13 · Updated Jul 7, 2021
- (IJCAI 2019) Knowledge Amalgamation from Heterogeneous Networks by Common Feature Learning ☆10 · Updated Nov 25, 2022
- Knowledge Amalgamation Engine ☆100 · Updated Feb 28, 2024
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated Jun 9, 2021
- Official PyTorch implementation for the paper "Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledg…" ☆14 · Updated Mar 5, 2023
- Reproducing VID from CVPR 2019 (work in progress) ☆20 · Updated Nov 25, 2019
- Official codebase for our paper "Joslim: Joint Widths and Weights Optimization for Slimmable Neural Networks" ☆12 · Updated Jun 30, 2021
- Knowledge Amalgamation, Multi-teacher KD, Ensemble KD ☆12 · Updated Sep 21, 2021
- Implementation of "Robust Weight Perturbation for Adversarial Training" (IJCAI 2022) ☆16 · Updated Jul 1, 2022
- ☆20 · Updated Sep 28, 2020
- Multi-Organ Foundation Model for Universal Ultrasound Image Segmentation with Task Prompt and Anatomical Prior ☆16 · Updated Sep 30, 2024
- [CVPR'24] Official implementation of our paper "Self-Supervised Facial Representation Learning with Facial Region Awareness" ☆15 · Updated Mar 8, 2024
- (CVPR 2022) Official PyTorch implementation of KDEP, "Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-eff…" ☆60 · Updated Jul 21, 2022
- [AAAI 2025] QCS: Feature Refining from Quadruplet Cross Similarity for Facial Expression Recognition ☆21 · Updated Jul 3, 2025
- [AAAI 2021, TKDE 2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration" ☆78 · Updated Jul 29, 2024
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated Dec 15, 2022
- Latest Weight Averaging (NeurIPS HITY 2022) ☆33 · Updated Jun 20, 2023
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆15 · Updated Oct 22, 2020
- Switchable Online Knowledge Distillation ☆19 · Updated Oct 27, 2024
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated Nov 22, 2022
- ☆50 · Updated Jun 12, 2023
- PyTorch implementation for the paper "Towards Realistic Predictors" ☆17 · Updated Sep 26, 2018
- Topology Distillation for Recommender System (KDD '21) ☆13 · Updated Sep 2, 2021
- A repository for Prototype Attention-based Multiple Instance Learning ☆14 · Updated Jun 29, 2024
- Official code for "Cumulative Spatial Knowledge Distillation for Vision Transformers" (ICCV 2023) https://openaccess.thecvf.com/content/ICC… ☆15 · Updated Nov 5, 2023
- 3D US-CT/MRI Registration for Liver Tumor Ablation ☆11 · Updated Oct 15, 2025
- Code for "Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint" ☆21 · Updated Oct 23, 2023
- ACCV 2022 source code for the paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Updated Jul 5, 2023
- ☆61 · Updated Apr 24, 2020
- A collection of Zero-shot/Data-free knowledge distillation papers from 2019 to 2021 ☆11 · Updated Sep 8, 2021
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆182 · Updated Jan 29, 2022
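Most repositories above build on the same core idea: matching a student's temperature-softened predictions to a teacher's. As a minimal illustrative sketch (not taken from any of the listed codebases; the helper names `softmax` and `kd_loss` are my own), the classic Hinton-style distillation loss can be written in plain Python as:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 to keep gradient magnitudes comparable
    across temperatures (as in Hinton et al.'s formulation)."""
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Identical logits give zero distillation loss.
print(round(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # → 0.0
```

Multi-teacher variants such as AE-KD and AMTML-KD typically replace the single `teacher_logits` with a weighted combination of several teachers' soft targets; how those weights are chosen is where the listed methods differ.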