claim
Pre-training contributes to LLM hallucinations because the density-estimation objective (next-token prediction) forces the model to output a confident probability distribution over continuations even for information it has never learned.
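A minimal sketch of the mechanism behind this claim (not from the cited source): the softmax at the output of a pre-trained model always normalizes raw scores into a full probability distribution, so even near-uniform logits for an unfamiliar input still yield a committed "guess" with no abstain option.

```python
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution over the vocabulary.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)
# Hypothetical logits for a continuation the model never learned:
# near-zero noise, i.e. the model has no real signal here.
unfamiliar_logits = [random.gauss(0.0, 0.1) for _ in range(5)]
probs = softmax(unfamiliar_logits)

# The distribution still sums to 1: the objective forces the model to
# place its full probability mass on *some* continuation regardless.
print(sum(probs))
print(max(probs))
```

The point of the sketch is that nothing in the density-estimation objective lets the model output "I don't know"; uncertainty can only appear as a flatter distribution, and greedy or sampled decoding will still emit a concrete token.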
Authors
Sources
- What Really Causes Hallucinations in LLMs? - AI Exploration Journey (aiexpjourney.substack.com)
Referenced by nodes (2)
- LLM hallucinations in medicine concept
- Pre-training concept