Claim
Large Language Models can hallucinate patient information, history, and symptoms when generating clinical notes, producing content that does not align with the original clinical record.
Authors
Sources
- Medical Hallucination in Foundation Models and Their ... (www.medrxiv.org)
Referenced by nodes (2)
- Large Language Models (concept)
- clinical notes (concept)