open-mmlab / Live2Diff
Live2Diff: a pipeline that processes live video streams with a uni-directional video diffusion model.
☆199 · Updated last year
Alternatives and similar repositories for Live2Diff
Users interested in Live2Diff are comparing it to the repositories listed below.
- InteractiveVideo: User-Centric Controllable Video Generation with Synergistic Multimodal Instructions — ☆131 · Updated last year
- Video-Infinity: generates long videos quickly using multiple GPUs without extra training — ☆187 · Updated last year
- [CVPR 2025] Consistent and Controllable Image Animation with Motion Diffusion Models — ☆292 · Updated 6 months ago
- Official repo for DiffArtist (ACM MM 2025) — ☆124 · Updated 5 months ago
- The official implementation of "RepVideo: Rethinking Cross-Layer Representation for Video Generation" — ☆123 · Updated 10 months ago
- MoMA: Multimodal LLM Adapter for Fast Personalized Image Generation — ☆234 · Updated last year
- Official implementation of MagicFace: Training-free Universal-Style Human Image Customized Synthesis — ☆65 · Updated 11 months ago
- Keyframe Interpolation with CogvideoX