claim
Medical hallucinations in LLMs pose serious risks because incorrect medical information, such as fabricated dosages, drug interactions, or diagnostic criteria, can lead to life-threatening outcomes.
Authors
Sources
- Medical Hallucination in Foundation Models and Their Impact on ... (www.medrxiv.org)
Referenced by nodes (2)
- Large Language Models concept
- medical hallucination concept