pipilurj / DynaFed
☆50 · Updated 2 years ago
Alternatives and similar repositories for DynaFed
Users interested in DynaFed are comparing it to the libraries listed below
- ☆86 · Updated 2 years ago
- ☆27 · Updated 2 years ago
- Efficient Dataset Distillation by Representative Matching ☆113 · Updated last year
- ICLR 2024, Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching ☆101 · Updated last year
- The official repository for paper "MLLM-Protector: Ensuring MLLM’s Safety without Hurting Performance" ☆37 · Updated last year
- [ICLR2023] Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning (https://arxiv.org/abs/2210.0022…) ☆40 · Updated 2 years ago
- [ICLR 2023] Trainable Weight Averaging: Efficient Training by Optimizing Historical Solutions ☆27 · Updated 4 months ago
- The official code for ICML 2024 "FedREDefense: Defending against Model Poisoning Attacks for Federated Learning using Model Update Recons…" ☆25 · Updated last year
- [TPAMI 2023] Low Dimensional Landscape Hypothesis is True: DNNs can be Trained in Tiny Subspaces ☆42 · Updated 2 years ago
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation ☆71 · Updated 2 years ago
- Official Code for Dataset Distillation using Neural Feature Regression (NeurIPS 2022) ☆47 · Updated 2 years ago
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆39 · Updated 2 years ago
- COALA: A Practical and Vision-Centric Federated Learning Platform, accepted to ICML'24 ☆118 · Updated 7 months ago
- ☆65 · Updated 2 years ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆71 · Updated 4 months ago
- Translation of the VHL repo in Paddle ☆25 · Updated 2 years ago
- [ICLR 2023] "Combating Exacerbated Heterogeneity for Robust Models in Federated Learning"☆32Updated 2 years ago
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation☆72Updated 3 years ago
- Implementation of "DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation" (accepted by NAACL 2024 Findings) ☆21 · Updated 4 months ago
- IDEAL: Influence-Driven Selective Annotations Empower In-Context Learners in Large Language Models ☆59 · Updated last year
- Prioritize Alignment in Dataset Distillation ☆20 · Updated 6 months ago
- A method of dataset condensation, accepted at CVPR 2022 ☆69 · Updated last year
- Code repository for "Federated Composite Optimization", to appear in ICML 2021 ☆12 · Updated 3 years ago
- [CVPR 2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM) ☆28 · Updated 8 months ago
- This repository is the official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022 ☆21 · Updated 3 years ago
- ☆99 · Updated 10 months ago
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) ☆113 · Updated last year
- Data-Free Knowledge Distillation ☆21 · Updated 3 years ago
- ☆29 · Updated 2 years ago
- ☆39 · Updated 2 years ago