Hritikbansal / videophy
Video Generation, Physical Commonsense, Semantic Adherence, VideoCon-Physics
☆129 · Updated 3 months ago
Alternatives and similar repositories for videophy
Users interested in videophy are comparing it to the repositories listed below.
- Code release for "PISA Experiments: Exploring Physics Post-Training for Video Diffusion Models by Watching Stuff Drop" (ICML 2025)☆39Updated 2 months ago
- [Neurips 2024] Video Diffusion Models are Training-free Motion Interpreter and Controller☆45Updated 3 months ago
- [ICML2025] The code and data of Paper: Towards World Simulator: Crafting Physical Commonsense-Based Benchmark for Video Generation☆116Updated 9 months ago
- official repo for "VideoScore: Building Automatic Metrics to Simulate Fine-grained Human Feedback for Video Generation" [EMNLP2024]☆94Updated 5 months ago
- [ICLR 2024] LLM-grounded Video Diffusion Models (LVD): official implementation for the LVD paper☆155Updated last year
- Official Implementation of Paper Transfer between Modalities with MetaQueries☆186Updated 3 weeks ago
- A list of works on video generation towards world model☆161Updated this week
- official code repo of CVPR 2025 paper PhyT2V: LLM-Guided Iterative Self-Refinement for Physics-Grounded Text-to-Video Generation☆40Updated this week
- VideoREPA: Learning Physics for Video Generation through Relational Alignment with Foundation Models☆54Updated 2 months ago
- PyTorch implementation of DiffMoE, TC-DiT, EC-DiT and Dense DiT☆121Updated 3 months ago
- Code for: "Long-Context Autoregressive Video Modeling with Next-Frame Prediction"