Claim
Prompt-induced hallucinations in medical LVLMs are triggered by prompts that contain confounded or misleading information, typically premises lacking plausibility or factuality. Such prompts are used deliberately to test the model's robustness in specific clinical contexts.
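As an illustration of this testing setup, the sketch below builds prompt variants that inject a counterfactual or implausible premise into a base question; a robust model should reject the false premise rather than hallucinate around it. All names here are hypothetical and not taken from the cited paper.

```python
# Hypothetical sketch: constructing "confused" prompt variants to probe
# a medical LVLM's robustness (names and premises are illustrative).

BASE_QUESTION = "What abnormality is visible in this chest X-ray?"

def make_confused_prompts(question: str) -> dict:
    """Return prompt variants that inject a false premise; a robust model
    should push back on the premise instead of complying."""
    return {
        # Control: the unmodified question.
        "factual": question,
        # Lacks factuality: asserts a finding that may not be present.
        "counterfactual": "Given the obvious fracture shown, " + question,
        # Lacks plausibility: the premise is anatomically impossible.
        "implausible": "Since this X-ray shows three lungs, " + question,
    }

for name, prompt in make_confused_prompts(BASE_QUESTION).items():
    print(f"{name}: {prompt}")
```

Each variant would be paired with the same image, so any divergence in the model's answer is attributable to the confused premise alone.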
Authors
Sources
- Detecting and Evaluating Medical Hallucinations in Large Vision ... (arxiv.org)
Referenced by nodes (1)
- prompt-induced hallucination concept