DevPranjal / reproduction-review-kd
Reproduction of the CVPR'21 paper "Distilling Knowledge via Knowledge Review" for the ML Reproducibility Challenge 2021
☆10 · Updated 2 years ago
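For orientation, the reproduced paper distills a student network by "reviewing" its features across levels: student features are fused and then matched to the teacher's feature maps at several spatial scales. The snippet below is only a minimal sketch of such a multi-scale feature-matching loss in PyTorch; the function name `multi_scale_feature_loss`, the pooling sizes, and the weighting scheme are illustrative assumptions, not the code of this repository or the original paper.

```python
import torch
import torch.nn.functional as F


def multi_scale_feature_loss(student_feat, teacher_feat, pool_sizes=(4, 2, 1)):
    """Hedged sketch of a review-style distillation loss (not the repo's code).

    Compares a student feature map with the teacher's feature map at full
    resolution and again after average-pooling both to a few coarser grids,
    down-weighting the coarser terms. Pool sizes and weights are assumptions.
    """
    loss = F.mse_loss(student_feat, teacher_feat)
    weight, total = 1.0, 1.0
    for size in pool_sizes:
        if size >= student_feat.shape[-1]:
            continue  # skip scales that are not actually coarser than the map
        s = F.adaptive_avg_pool2d(student_feat, (size, size))
        t = F.adaptive_avg_pool2d(teacher_feat, (size, size))
        weight /= 2.0
        loss = loss + weight * F.mse_loss(s, t)
        total += weight
    return loss / total


# Toy usage with one student/teacher feature pair of matching shape.
s_feat = torch.randn(8, 64, 16, 16)
t_feat = torch.randn(8, 64, 16, 16)
print(multi_scale_feature_loss(s_feat, t_feat))
```

In the paper, a multi-level fusion of student features (the review step) happens before a loss of this kind is applied; the sketch deliberately omits that part and only illustrates the per-pair, multi-scale comparison.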
Alternatives and similar repositories for reproduction-review-kd:
Users interested in reproduction-review-kd are comparing it to the repositories listed below.
- Our submission for the Microsoft Membership Inference Competition at SaTML 2023 ☆15 · Updated last year
- Winning solution for model extraction over Vision Transformers such as Video-Swin-T and MoViNeT-A2-Base on Video Action-Recog… ☆18 · Updated 2 years ago
- Solutions to the exercises in Dive into Deep Learning, in PyTorch ☆10 · Updated 2 years ago
- Wheat detection using Faster RCNN ☆15 · Updated last year
- Following research on S4 in jax ☆14 · Updated 2 years ago
- ☆33 · Updated last year
- Knowledge distillation (KD) from a decision-based black-box (DB3) teacher without training data. ☆21 · Updated 2 years ago
- This is a question bank for practicing Machine Learning for Interviews. ☆33 · Updated 2 years ago
- Web-based tool for visualisation and generation of adversarial examples by attacking ImageNet models like VGG, AlexNet, ResNet, etc. ☆51 · Updated last year
- PyTorch implementation for the paper "Toward Multimodal Image-to-Image Translation" ☆10 · Updated 3 years ago
- Transformers trained on Tiny ImageNet ☆50 · Updated 2 years ago
- The official code for the publication: "The Close Relationship Between Contrastive Learning and Meta-Learning". ☆19 · Updated 2 years ago
- Data-enriching GAN for retrieving Representative Samples from a Trained Classifier ☆12 · Updated 4 years ago
- ☆15 · Updated last year
- [NeurIPS 2022] Source code for our paper "Escaping Saddle Points for Effective Generalization on Class-Imbalanced Data" ☆22 · Updated last year
- Code for the paper "Representational Continuity for Unsupervised Continual Learning" (ICLR 2022) ☆94 · Updated last year
- EnD: Entangling and Disentangling deep representations for bias correction | CVPR 2021 https://doi.org/10.1109/CVPR46437.2021.01330 ☆16 · Updated 3 years ago
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation). ☆73 · Updated 2 months ago
- Implementation of Contrastive Learning with Adversarial Examples ☆28 · Updated 4 years ago
- This is the official implementation of the paper "Decoupled Adversarial Contrastive Learning for Self-supervised Adversarial Robustness,"… ☆19 · Updated 6 months ago
- On the effectiveness of adversarial training against common corruptions [UAI 2022] ☆30 · Updated 2 years ago
- [NeurIPS 2021] “When does Contrastive Learning Preserve Adversarial Robustness from Pretraining to Finetuning?” ☆48 · Updated 3 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- Unofficial implementation of the DeepMind papers "Uncovering the Limits of Adversarial Training against Norm-Bounded Adversarial Examples… ☆95 · Updated 2 years ago
- ICLR 2021 i-Mix: A Domain-Agnostic Strategy for Contrastive Representation Learning ☆77 · Updated last year
- ☆53 · Updated last year
- Code for the paper "A Whac-A-Mole Dilemma: Shortcuts Come in Multiples Where Mitigating One Amplifies Others" ☆47 · Updated 6 months ago
- Official code for Efficient and Effective Augmentation Strategy for Adversarial Training (NeurIPS 2022) ☆16 · Updated last year
- Official code of "Discover and Mitigate Unknown Biases with Debiasing Alternate Networks" (ECCV 2022) ☆23 · Updated last year
- Code for the CVPR 2022 paper "Deep Unlearning via Randomized Conditionally Independent Hessians" ☆25 · Updated 2 years ago