[ICLR 2025] Mitigating Modality Prior-Induced Hallucinations in Multimodal Large Language Models via Deciphering Attention Causality
☆65 · Jul 5, 2025 · Updated 9 months ago
Alternatives and similar repositories for CausalMM
Users interested in CausalMM are comparing it to the repositories listed below.
- Latest Advances on Modality Priors in Multimodal Large Language Models (☆31, Dec 10, 2025, updated 4 months ago)
- [ICML 2025] Official implementation of paper 'Look Twice Before You Answer: Memory-Space Visual Retracing for Hallucination Mitigation in…' (☆171, Sep 25, 2025, updated 6 months ago)
- Code for paper: Visual Signal Enhancement for Object Hallucination Mitigation in Multimodal Large Language Models