measurement
Even with current state-of-the-art Large Vision-Language Models, at least 30% of generated text is hallucinatory, taking the form of nonexistent objects, unfaithful descriptions, and inaccurate relationships.
Authors
Sources
- Detecting and Evaluating Medical Hallucinations in Large Vision ... (arxiv.org)
Referenced by nodes (1)
- Large Vision-Language Models concept