shikiw / OPERA
[CVPR 2024 Highlight] OPERA: Alleviating Hallucination in Multi-Modal Large Language Models via Over-Trust Penalty and Retrospection-Allocation
☆374 · Updated last year
Alternatives and similar repositories for OPERA
Users interested in OPERA are comparing it to the repositories listed below.
- [CVPR 2024 Highlight] Mitigating Object Hallucinations in Large Vision-Language Models through Visual Contrastive Decoding ☆319 · Updated 11 months ago
- [CVPR'24] HallusionBench: You See What You Think? Or You Think What You See? An Image-Context Reasoning Benchmark Challenging for GPT-4V(… ☆299 · Updated 10 months ago
- The official GitHub page for "Evaluating Object Hallucination in Large Vision-Language Models"