xuguodong03 / UNIXKD
☆27 · Updated 2 years ago
Alternatives and similar repositories for UNIXKD
Users interested in UNIXKD are comparing it to the repositories listed below.
- ☆58 · Updated 4 years ago
- [CVPR 2021] "The Lottery Tickets Hypothesis for Supervised and Self-supervised Pre-training in Computer Vision Models" Tianlong Chen, Jon… · ☆68 · Updated 2 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) · ☆15 · Updated 2 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… · ☆82 · Updated 3 years ago
- [ICLR 2021] "Long Live the Lottery: The Existence of Winning Tickets in Lifelong Learning" by Tianlong Chen*, Zhenyu Zhang*, Sijia Liu, S… · ☆25 · Updated 3 years ago
- Cyclic Differentiable Architecture Search · ☆36 · Updated 3 years ago
- A PyTorch implementation of the ICCV 2021 workshop paper "SimDis: Simple Distillation Baselines for Improving Small Self-supervised Models" · ☆14 · Updated 4 years ago
- [ICLR 2020] Haotao Wang, Tianlong Chen, Zhangyang Wang, Kede Ma, "I Am Going MAD: Maximum Discrepancy Competition for Comparing Classifie… · ☆20 · Updated 3 years ago
- ☆31 · Updated 5 years ago
- ☆13 · Updated 3 years ago
- Compressing Representations for Self-Supervised Learning · ☆78 · Updated 4 years ago
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) · ☆65 · Updated 4 years ago
- ☆23 · Updated 6 years ago
- [NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang… · ☆89 · Updated 2 years ago
- Code for ViTAS: Vision Transformer Architecture Search · ☆51 · Updated 4 years ago
- [NeurIPS'22] What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective · ☆37 · Updated 2 years ago
- [ICLR 2022] Fast AdvProp · ☆35 · Updated 3 years ago
- ☆20 · Updated 2 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation · ☆30 · Updated 4 years ago
- Repo for the paper "Extrapolating from a Single Image to a Thousand Classes using Distillation" · ☆37 · Updated last year
- PyTorch implementation of our paper accepted by IEEE TMM, 2022 -- Learning Efficient GANs for Image Translation via Differentiable Masks a… · ☆54 · Updated 2 years ago
- Implementation of PGONAS (CVPR 2022 Workshop) and RD-NAS (ICASSP 2023) · ☆22 · Updated 2 years ago
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression · ☆19 · Updated 4 years ago
- Official PyTorch implementation of "Semantic Diversity Learning for Zero-Shot Multi-label Classification" (ICCV 2021) · ☆31 · Updated 3 years ago
- Code for the paper "Few Shot Network Compression via Cross Distillation", AAAI 2020 · ☆31 · Updated 5 years ago
- Paper and code for "Curriculum Learning by Optimizing Learning Dynamics" (AISTATS 2021) · ☆19 · Updated 4 years ago
- Code for our NeurIPS 2020 paper "Auto Learning Attention", coming soon · ☆22 · Updated 4 years ago
- [ICLR'23] Trainability Preserving Neural Pruning (PyTorch) · ☆34 · Updated 2 years ago
- Improving Contrastive Learning by Visualizing Feature Transformation, ICCV 2021 Oral · ☆90 · Updated 4 years ago
- Code for "DATA: Differentiable ArchiTecture Approximation" · ☆11 · Updated 4 years ago