Claim
Large Language Models often lack training exposure to rare diseases, which leads them to hallucinate when generating diagnostic insights for those conditions.
Authors
Sources
- Medical Hallucination in Foundation Models and Their Impact on ... (www.medrxiv.org)
Referenced by nodes (1)
- Large Language Models (concept)