claim
Hegselmann et al. (2024b) state that inconsistencies in clinical datasets, such as electronic health records and physician notes, propagate errors into large language model (LLM) training.
Authors
Sources
- Medical Hallucination in Foundation Models and Their ... www.medrxiv.org
Referenced by nodes (1)
- electronic health records concept