OPTML-Group / DP4TL
[NeurIPS2023] "Selectivity Drives Productivity: Efficient Dataset Pruning for Enhanced Transfer Learning" by Yihua Zhang*, Yimeng Zhang*, Aochuan Chen*, Jinghan Jia, Jiancheng Liu, Gaowen Liu, Mingyi Hong, Shiyu Chang, Sijia Liu
☆14 · Updated 2 years ago
Alternatives and similar repositories for DP4TL
Users interested in DP4TL are comparing it to the libraries listed below.
- ☆15 · Updated last year
- (ICML 2023) Discover and Cure: Concept-aware Mitigation of Spurious Correlation ☆42 · Updated last year
- Official PyTorch implementation of "Multisize Dataset Condensation" (ICLR'24 Oral) ☆15 · Updated last year
- Official PyTorch implementation of "Loss-Curvature Matching for Dataset Selection and Condensation" (AISTATS 2023) ☆22 · Updated 2 years ago
- ☆88 · Updated 2 years ago
- [ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baha… ☆16 · Updated last year
- AAAI 2024, M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy ☆25 · Updated last year
- This repository is the official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022. ☆22 · Updated 3 years ago
- ☆29 · Updated last year
- ☆28 · Updated 2 years ago
- ☆14 · Updated 2 years ago
- A Task of Fictitious Unlearning for VLMs ☆23 · Updated 7 months ago
- ICLR 2022 (Spotlight): Continual Learning With Filter Atom Swapping ☆16 · Updated 2 years ago
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆40 · Updated 2 years ago
- With respect to the input tensor instead of parameters of NN ☆21 · Updated 3 years ago
- Code for ICLR 2023 "Harnessing Out-Of-Distribution Examples via Augmenting Content and Style" ☆13 · Updated 2 years ago
- ☆39 · Updated 2 years ago
- [ICLR 2025] "Rethinking LLM Unlearning Objectives: A Gradient Perspective and Go Beyond" ☆13 · Updated 8 months ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆76 · Updated 8 months ago
- ☆26 · Updated 2 years ago
- Code for paper "Parameter Efficient Multi-task Model Fusion with Partial Linearization" ☆23 · Updated last year
- ☆15 · Updated 8 months ago
- Official PyTorch implementation for Frequency Domain-based Dataset Distillation [NeurIPS 2023] ☆30 · Updated last year
- Repository for research works and resources related to model reprogramming <https://arxiv.org/abs/2202.10629> ☆64 · Updated last month
- Mitigating Spurious Correlations in Multi-modal Models during Fine-tuning (ICML 2023) ☆19 · Updated last year
- [NeurIPS23 (Spotlight)] "Model Sparsity Can Simplify Machine Unlearning" by Jinghan Jia*, Jiancheng Liu*, Parikshit Ram, Yuguang Yao, Gao… ☆81 · Updated last year
- Code for the paper "Efficient Dataset Distillation using Random Feature Approximation" ☆37 · Updated 2 years ago
- Representation Surgery for Multi-Task Model Merging. ICML, 2024. ☆46 · Updated last year
- AdaMerging: Adaptive Model Merging for Multi-Task Learning. ICLR, 2024. ☆95 · Updated last year
- [ICLR 2023, ICLR DG oral] PAIR, the optimizer and model selection criteria for OOD Generalization ☆53 · Updated last year