OPTML-Group / DP4TL
[NeurIPS 2023] "Selectivity Drives Productivity: Efficient Dataset Pruning for Enhanced Transfer Learning" by Yihua Zhang*, Yimeng Zhang*, Aochuan Chen*, Jinghan Jia, Jiancheng Liu, Gaowen Liu, Mingyi Hong, Shiyu Chang, Sijia Liu
☆14 · Updated 2 years ago
Alternatives and similar repositories for DP4TL
Users interested in DP4TL are comparing it to the libraries listed below.
- (ICML 2023) Discover and Cure: Concept-aware Mitigation of Spurious Correlation ☆43 · Updated last month
- [ICLR 2025] "Rethinking LLM Unlearning Objectives: A Gradient Perspective and Go Beyond" ☆15 · Updated 10 months ago
- ☆15 · Updated last year
- Official PyTorch implementation of "Multisize Dataset Condensation" (ICLR 2024 Oral) ☆15 · Updated last year
- With respect to the input tensor instead of the parameters of the NN ☆21 · Updated 3 years ago
- [SatML 2024] Shake to Leak: Fine-tuning Diffusion Models Can Amplify the Generative Privacy Risk ☆16 · Updated 10 months ago
- ☆89 · Updated 2 years ago
- A Task of Fictitious Unlearning for VLMs ☆28 · Updated 9 months ago
- ☆26 · Updated 2 years ago
- Mitigating Spurious Correlations in Multi-modal Models during Fine-tuning (ICML 2023) ☆19 · Updated 2 years ago
- ☆30 · Updated last year
- Official code repo for the paper "Merging Models on the Fly Without Retraining: A Sequential Approach to Scalable Continual Model Merging" ☆22 · Updated 3 months ago
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆40 · Updated 2 years ago
- Representation Surgery for Multi-Task Model Merging (ICML 2024) ☆47 · Updated last year
- [AAAI 2024] M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy ☆25 · Updated last year
- Official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022 ☆22 · Updated 3 years ago
- Code for the paper "Parameter Efficient Multi-task Model Fusion with Partial Linearization" ☆24 · Updated last year
- Official repository of "Localizing Task Information for Improved Model Merging and Compression" [ICML 2024] ☆51 · Updated 3 weeks ago
- [ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baha… ☆15 · Updated last year
- [NeurIPS 2023 (Spotlight)] "Model Sparsity Can Simplify Machine Unlearning" by Jinghan Jia*, Jiancheng Liu*, Parikshit Ram, Yuguang Yao, Gao… ☆80 · Updated last year
- Official PyTorch implementation of "Loss-Curvature Matching for Dataset Selection and Condensation" (AISTATS 2023) ☆22 · Updated 2 years ago
- AdaMerging: Adaptive Model Merging for Multi-Task Learning (ICLR 2024) ☆99 · Updated last year
- Code for the paper "Efficient Dataset Distillation using Random Feature Approximation" ☆37 · Updated 2 years ago
- [NeurIPS 2023] Code release for "Going Beyond Linear Mode Connectivity: The Layerwise Linear Feature Connectivity" ☆19 · Updated 2 years ago
- Code for the ICLR 2023 paper "Harnessing Out-Of-Distribution Examples via Augmenting Content and Style" ☆13 · Updated 2 years ago
- [ICCV 2023] DataDAM: Efficient Dataset Distillation with Attention Matching ☆33 · Updated last year
- ☆28 · Updated 2 years ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆80 · Updated 10 months ago
- ☆19 · Updated 2 years ago
- Towards understanding modern generative data augmentation techniques ☆27 · Updated 2 years ago