claim
Generation pressure causes hallucinations: the always-generate training objective, overconfident priors learned from confidently written web content, prompt-answer alignment bias, and decoding artifacts together push the model to produce confident assertions regardless of its actual knowledge state.
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com)
Referenced by nodes (1)
- hallucination concept
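The "always-generate" part of the claim can be sketched concretely: standard decoding must emit exactly one token at each step, with no abstain action, so even a near-uniform next-token distribution yields a single fluent answer. A minimal illustration (the vocabulary and logit values are hypothetical, chosen only to show a low-confidence distribution):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits over a tiny vocab: nearly flat,
# i.e. the model has no strong knowledge of the right answer.
vocab = ["Paris", "Lyon", "Nice", "Bordeaux"]
logits = [1.05, 1.00, 0.98, 0.97]

probs = softmax(logits)
choice = max(range(len(vocab)), key=lambda i: probs[i])

# Greedy decoding still commits to one token; there is no built-in
# "I don't know" outcome, so the output reads as a confident claim.
print(vocab[choice], round(probs[choice], 3))
```

Even though the top probability here is only about 0.26, the decoder emits a definite answer, which is the mechanism the claim describes.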