OPTML-Group / DP4TL
[NeurIPS2023] "Selectivity Drives Productivity: Efficient Dataset Pruning for Enhanced Transfer Learning" by Yihua Zhang*, Yimeng Zhang*, Aochuan Chen*, Jinghan Jia, Jiancheng Liu, Gaowen Liu, Mingyi Hong, Shiyu Chang, Sijia Liu
☆13 · Updated last year
Alternatives and similar repositories for DP4TL
Users interested in DP4TL are comparing it to the repositories listed below.
- (ICML 2023) Discover and Cure: Concept-aware Mitigation of Spurious Correlation ☆41 · Updated last year
- [ICLR 2025] "Rethinking LLM Unlearning Objectives: A Gradient Perspective and Go Beyond" ☆12 · Updated 6 months ago
- Official PyTorch implementation of "Multisize Dataset Condensation" (ICLR'24 Oral) ☆14 · Updated last year
- ☆26 · Updated 2 years ago
- ☆15 · Updated last year
- ☆86 · Updated 2 years ago
- Code for ICLR 2023 Harnessing Out-Of-Distribution Examples via Augmenting Content and Style ☆13 · Updated 2 years ago
- [ICLR 2023, ICLR DG oral] PAIR, the optimizer and model selection criteria for OOD Generalization ☆52 · Updated last year
- Mitigating Spurious Correlations in Multi-modal Models during Fine-tuning (ICML 2023) ☆18 · Updated last year
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆40 · Updated 2 years ago
- Implementation of Concept-level Debugging of Part-Prototype Networks ☆12 · Updated 2 years ago
- Official PyTorch implementation of "Loss-Curvature Matching for Dataset Selection and Condensation" (AISTATS 2023) ☆21 · Updated 2 years ago
- ☆65 · Updated 10 months ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆73 · Updated 6 months ago
- [NeurIPS23 (Spotlight)] "Model Sparsity Can Simplify Machine Unlearning" by Jinghan Jia*, Jiancheng Liu*, Parikshit Ram, Yuguang Yao, Gao… ☆77 · Updated last year
- Translation of the VHL repo in Paddle ☆25 · Updated 2 years ago
- With respect to the input tensor instead of the parameters of the NN ☆21 · Updated 3 years ago
- ☆15 · Updated last year
- Official PyTorch implementation for Frequency Domain-based Dataset Distillation [NeurIPS 2023] ☆30 · Updated last year
- AAAI 2024, M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy ☆25 · Updated last year
- ☆29 · Updated last year
- [ICCV 2023] DataDAM: Efficient Dataset Distillation with Attention Matching ☆34 · Updated last year
- Repository for research works and resources related to model reprogramming <https://arxiv.org/abs/2202.10629> ☆61 · Updated last year
- Official Code for ICLR 2022 Paper: Chaos is a Ladder: A New Theoretical Understanding of Contrastive Learning via Augmentation Overlap ☆28 · Updated 2 years ago
- [ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baha… ☆14 · Updated last year
- ☆24 · Updated 2 years ago
- This repository is the official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022. ☆22 · Updated 3 years ago
- Representation Surgery for Multi-Task Model Merging. ICML, 2024. ☆46 · Updated 10 months ago
- ☆42 · Updated last year
- Code for paper "Parameter Efficient Multi-task Model Fusion with Partial Linearization" ☆21 · Updated 11 months ago