xiaoboxia / PICMM
NeurIPS 2022: Pluralistic Image Completion with Gaussian Mixture Models
☆14 · Updated 2 years ago
Alternatives and similar repositories for PICMM
Users interested in PICMM are comparing it to the repositories listed below.
- IDEAL: Influence-Driven Selective Annotations Empower In-Context Learners in Large Language Models — ☆59 · Updated last year
- Code for our ICML'24 paper on multimodal dataset distillation — ☆38 · Updated 10 months ago
- A PyTorch implementation of the CVPR'24 paper "D4M: Dataset Distillation via Disentangled Diffusion Model" — ☆35 · Updated 11 months ago
- Uni-OVSeg is a weakly supervised open-vocabulary segmentation framework that leverages unpaired mask-text pairs. — ☆52 · Updated last year
- ☆29 · Updated last year
- [CVPR 2024] Efficient Dataset Distillation via Minimax Diffusion — ☆96 · Updated last year
- ☆10 · Updated 2 months ago
- [CIKM 2024] Official code for "ERASE: Error-Resilient Representation Learning on Graphs for Label Noise Tolerance" — ☆18 · Updated last year
- Diffusion-TTA improves pre-trained discriminative models, such as image classifiers or segmentors, using pre-trained generative models. — ☆74 · Updated last year
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm — ☆73 · Updated 6 months ago
- Official code implementation of Robust Classification via a Single Diffusion Model — ☆91 · Updated 5 months ago
- ☆18 · Updated last year
- ☆42 · Updated last year
- With respect to the input tensor instead of the parameters of the NN — ☆21 · Updated 3 years ago
- Code for the paper "Finetuning Text-to-Image Diffusion Models for Fairness" — ☆43 · Updated last year
- [NeurIPS 2023] Generalized Logit Adjustment — ☆38 · Updated last year
- Efficient Dataset Distillation by Representative Matching — ☆112 · Updated last year
- Official repo for "ICT: Image-Object Cross-Level Trusted Intervention for Mitigating Object Hallucination in Large Vision-Language Models" — ☆22 · Updated 5 months ago
- [CVPR 2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM) — ☆29 · Updated 10 months ago
- Regularly Truncated M-estimators for Learning with Noisy Labels — ☆11 · Updated last year
- ☆16 · Updated last year
- Official PyTorch implementation of "Diffusion Models and Semi-Supervised Learners Benefit Mutually with Few Labels" — ☆95 · Updated last year
- Dataset distillation benchmark — ☆68 · Updated 2 months ago
- Official repository of "Back to the Source: Diffusion-Driven Test-Time Adaptation" — ☆80 · Updated last year
- Source code for the NeurIPS'23 paper "Dream the Impossible: Outlier Imagination with Diffusion Models" — ☆69 · Updated 4 months ago
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) — ☆40 · Updated 2 years ago
- ☆29 · Updated 2 years ago
- [ICCV 2023 Oral] Official implementation of "Denoising Diffusion Autoencoders are Unified Self-supervised Learners" — ☆177 · Updated last year
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original … — ☆129 · Updated 9 months ago
- ☆113 · Updated last year