Claim
Medical hallucinations in LLMs can manifest as incorrect diagnoses, confusing or inappropriate medical terminology, or contradictory findings within a patient's case.
Authors
Sources
- Medical Hallucination in Foundation Models and Their Impact on ... www.medrxiv.org via serper
Referenced by nodes (3)
- Large Language Models concept
- medical hallucination concept
- diagnosis concept