Claim
Medical hallucinations in Large Language Models (LLMs) pose serious risks: incorrect dosages, drug interactions, or diagnostic criteria can lead to life-threatening outcomes.
Authors
Sources
- Medical Hallucination in Foundation Models and Their ... www.medrxiv.org via serper
Referenced by nodes (2)
- Large Language Models concept
- medical hallucination concept