Claim
Medical hallucinations are defined as factually incorrect yet plausible, medically relevant outputs generated by AI/LLM tools.
Authors
Sources
- Medical Hallucination in Foundation Models and Their ... www.medrxiv.org via serper
Referenced by nodes (2)
- medical hallucination concept
- AI/LLM tools concept