evinpinar / Attend-and-Excite-diffusers
☆12 · Updated 2 years ago
Alternatives and similar repositories for Attend-and-Excite-diffusers
Users interested in Attend-and-Excite-diffusers are comparing it to the libraries listed below.
- Implementation of P+: Extended Textual Conditioning in Text-to-Image Generation ☆49 · Updated 2 years ago
- ☆79 · Updated last year
- ☆123 · Updated last year
- Official GitHub repository for the Text-Guided Video Editing (TGVE) competition of the LOVEU Workshop @ CVPR'23 ☆77 · Updated 2 years ago
- Official implementation of the paper "ProSpect: Prompt Spectrum for Attribute-Aware Personalization of Diffusion Models" (SIGGRAPH Asia 20…) ☆142 · Updated last year
- ☆24 · Updated 2 years ago
- CVPR-24 | Official codebase for ZONE: Zero-shot InstructiON-guided Local Editing ☆82 · Updated last year
- An unofficial implementation of DiffEdit on stable-diffusion ☆82 · Updated 3 years ago
- Official implementation for "LOVECon: Text-driven Training-free Long Video Editing with ControlNet" ☆43 · Updated 2 years ago
- ☆64 · Updated 2 years ago
- Code for the paper "Compositional Text-to-Image Synthesis with Attention Map Control of Diffusion Models" ☆45 · Updated 2 years ago
- [CVPR 2024, Oral] Attention Calibration for Disentangled Text-to-Image Personalization ☆108 · Updated last year
- ☆51 · Updated last year
- Official implementation for "A Neural Space-Time Representation for Text-to-Image Personalization" (SIGGRAPH Asia 2023) ☆181 · Updated 2 years ago
- ☆13 · Updated last year
- ICCV2023-Diffusion-Papers ☆108 · Updated 2 years ago
- ☆40 · Updated 11 months ago
- This repo contains the code for the PreciseControl project [ECCV'24] ☆69 · Updated last year
- [ECCV 2024] StoryImager: A Unified and Efficient Framework for Coherent Story Visualization and Completion ☆40 · Updated last year
- ☆24 · Updated 2 years ago
- [NeurIPS 2022] Official PyTorch implementation of "Towards Diverse and Faithful One-shot Domain Adaption of Generative Adversarial Networ…" ☆50 · Updated 3 years ago
- ☆14 · Updated 2 years ago
- Directed Diffusion: Direct Control of Object Placement through Attention Guidance (AAAI2024)