Claim
According to Lee et al. (2024), inadequate training data coverage creates knowledge gaps that cause large language models to hallucinate when addressing unfamiliar medical topics.
Authors
Sources
- Medical Hallucination in Foundation Models and Their ... www.medrxiv.org via serper
- Medical Hallucination in Foundation Models and Their Impact on ... www.medrxiv.org via serper
Referenced by nodes (3)
- Large Language Models concept
- training data concept
- knowledge gaps concept