claim
LLMs can hallucinate patient information, history, or symptoms when generating or summarizing clinical notes, resulting in content that diverges from the source record.
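An illustrative sketch of what "diverges from the source record" can mean in practice: a naive lexical check that flags summary content with no support in the source note. This is a hypothetical example, not the method of the cited paper; real systems would rely on clinical NER or entailment models rather than word overlap.

```python
import re

def unsupported_terms(source_note: str, generated_summary: str) -> set[str]:
    """Return lowercase words in the summary that never appear in the source note."""
    tokenize = lambda text: set(re.findall(r"[a-z][a-z\-]+", text.lower()))
    return tokenize(generated_summary) - tokenize(source_note)

# Hypothetical source record and LLM-generated summary.
source = "Patient reports mild headache for two days. No fever. No known allergies."
summary = "Patient has a severe headache with fever and a penicillin allergy."

print(unsupported_terms(source, summary))
# Flags e.g. 'severe', 'penicillin', 'allergy' as unsupported, but misses 'fever':
# the word occurs in the source only in negated form ("No fever"), showing why
# hallucinations that contradict the record are harder to catch than additions.
```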
Authors
Sources
- Medical Hallucination in Foundation Models and Their Impact on ... (www.medrxiv.org, via serper)
Referenced by nodes (2)
- Large Language Models concept
- clinical notes concept