reference
The causes of LLM hallucinations include:
- flawed training data (biases, inaccuracies, or inconsistencies)
- knowledge gaps (lack of domain-specific knowledge or context understanding)
- technical limitations (over-reliance on statistical patterns and vulnerability to manipulation)
Authors
Sources
- LLM Hallucinations: Causes, Consequences, Prevention (llmmodels.org, via serper)
Referenced by nodes (3)
- LLM hallucinations in medicine concept
- training data concept
- knowledge gaps concept