Guang000 / Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
☆1,750 Updated this week
Alternatives and similar repositories for Awesome-Dataset-Distillation
Users interested in Awesome-Dataset-Distillation also compare it to the repositories listed below.
- Dataset Condensation (ICLR '21 and ICML '21) ☆521 Updated last year
- Collection of awesome test-time (domain/batch/instance) adaptation methods ☆1,042 Updated 2 weeks ago
- Official code for our CVPR '22 paper "Dataset Distillation by Matching Training Trajectories" ☆427 Updated last year
- Some conferences' accepted paper lists (including AI, ML, and robotics) ☆1,264 Updated 6 months ago
- 2024 up-to-date list of DATASETS, CODEBASES, and PAPERS on Multi-Task Learning (MTL), from a machine learning perspective ☆776 Updated 2 months ago
- A curated list of prompt-based papers in computer vision and vision-language learning ☆921 Updated last year
- A collection of literature after or concurrent with Masked Autoencoder (MAE) (Kaiming He et al.) ☆843 Updated last year
- Awesome Machine Unlearning (A Survey of Machine Unlearning) ☆862 Updated this week
- ❄️🔥 Visual Prompt Tuning [ECCV 2022] https://arxiv.org/abs/2203.12119 ☆1,143 Updated last year
- A comprehensive list of awesome contrastive self-supervised learning papers ☆1,284 Updated 10 months ago
- A PyTorch toolbox for domain generalization, domain adaptation, and semi-supervised learning ☆1,353 Updated last year
- Reading notes on research across domain generalization, domain adaptation, causality, robustness, prompting, optimization, and generative models ☆1,226 Updated last year
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 26 knowledge distillation methods p… ☆1,537 Updated last week
- Open-source code for the paper "Dataset Distillation" ☆811 Updated last month
- A Unified Semi-Supervised Learning Codebase (NeurIPS '22) ☆1,508 Updated last month
- PyTorch implementation of various knowledge distillation (KD) methods ☆1,710 Updated 3 years ago
- The official implementation of [CVPR 2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV 2023] DOT: A Distill… ☆866 Updated last year
- Test-time Adaptation, Test-time Training, and Source-free Domain Adaptation ☆509 Updated last year
- Prompt Learning for Vision-Language Models (IJCV '22, CVPR '22) ☆2,027 Updated last year
- Awesome Knowledge-Distillation: knowledge distillation papers (2014–2021), organized by category ☆2,617 Updated 2 years ago
- A PyTorch Library for Multi-Task Learning ☆2,379 Updated 2 months ago
- Existing Literature about Machine Unlearning ☆890 Updated last month
- A collection of papers on the topic of "Computer Vision in the Wild (CVinW)" ☆1,322 Updated last year
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,359 Updated last year
- A collection of AWESOME things about mixture-of-experts ☆1,181 Updated 8 months ago
- An up-to-date list of works on Multi-Task Learning ☆363 Updated last month
- DomainBed is a suite to test domain generalization algorithms ☆1,532 Updated 7 months ago
- Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).