peterljq / Tutorial-of-Data-Distillation-and-Condensation
A comprehensive overview of Data Distillation and Condensation (DDC). DDC is a data-centric task in which a representative (i.e., small but training-effective) set of data is synthesized from a large original dataset.
☆13 · Updated 2 years ago
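As a toy illustration of the DDC idea (not code from the repository): for linear least squares, a dataset can be condensed exactly, because the fit depends only on the sufficient statistics XᵀX and Xᵀy. The NumPy sketch below builds just d synthetic points that reproduce the full-data solution; all names are illustrative assumptions.

```python
import numpy as np

def distill_linear(X, y):
    """Condense (X, y) with N rows into d synthetic rows that yield
    the same least-squares solution as the full dataset.

    Works because the normal equations depend on the data only
    through A = X^T X and b = X^T y.
    """
    A = X.T @ X                          # (d, d) sufficient statistic
    b = X.T @ y                          # (d,)   sufficient statistic
    lam, V = np.linalg.eigh(A)           # A = V diag(lam) V^T
    X_syn = np.sqrt(lam)[:, None] * V.T  # X_syn^T X_syn == A
    y_syn = V.T @ b / np.sqrt(lam)       # X_syn^T y_syn == b
    return X_syn, y_syn

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=10_000)

X_syn, y_syn = distill_linear(X, y)  # only 5 synthetic points
w_full, *_ = np.linalg.lstsq(X, y, rcond=None)
w_syn, *_ = np.linalg.lstsq(X_syn, y_syn, rcond=None)
print(np.allclose(w_full, w_syn))  # synthetic set reproduces the fit
```

For non-linear models (the setting the tutorial targets), no such closed form exists, which is why DDC methods instead optimize the synthetic set, e.g. by matching gradients or training trajectories between real and synthetic data.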
Alternatives and similar repositories for Tutorial-of-Data-Distillation-and-Condensation
Users interested in Tutorial-of-Data-Distillation-and-Condensation are comparing it to the repositories listed below.
- ☆11 · Updated 2 years ago
- Code for the ICML 2021 paper "iDARTS: Differentiable Architecture Search with Stochastic Implicit Gradients" ☆10 · Updated 4 years ago
- [ICML 2021] "Efficient Lottery Ticket Finding: Less Data is More" by Zhenyu Zhang*, Xuxi Chen*, Tianlong Chen*, Zhangyang Wang ☆25 · Updated 3 years ago
- Code for Double Blind Collaborative Learning (DBCL) ☆14 · Updated 4 years ago
- Paper list for In-context Learning 🌷 ☆20 · Updated 2 years ago
- Code associated with the paper "Fine-tuning Language Models over Slow Networks using Activation Compression with Guarantees" ☆28 · Updated 2 years ago
- ☆22 · Updated 2 years ago
- [ICLR 2022] "Sparsity Winning Twice: Better Robust Generalization from More Efficient Training" by Tianlong Chen*, Zhenyu Zhang*, Pengjun… ☆39 · Updated 3 years ago
- [CVPR 2024] DiffAgent: Fast and Accurate Text-to-Image API Selection with Large Language Model ☆17 · Updated last year
- Dataset Interfaces: Diagnosing Model Failures Using Controllable Counterfactual Generation ☆45 · Updated 2 years ago
- A PyTorch implementation of the paper "ViP: A Differentially Private Foundation Model for Computer Vision" ☆36 · Updated 2 years ago