Claim
In autoregressive large language models, each token is conditioned on previously generated tokens, so small early errors are baked into the context and propagate forward without correction, compounding into a divergence from the true distribution that the model never recovers from.
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com)
Referenced by nodes (1)
- Large Language Models concept
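The compounding effect in the claim can be sketched with a toy calculation. Under the simplifying assumption (not from the source) that each generated token is independently "off-distribution" with a fixed per-token error rate eps, and that no correction mechanism exists, the chance a length-T sequence stays fully error-free is (1 - eps)^T, which decays toward zero as generation length grows:

```python
def p_error_free(eps: float, length: int) -> float:
    """Probability that a sequence of `length` tokens contains no error,
    assuming independent per-token errors with rate `eps` (illustrative
    assumption, not a claim from the source)."""
    return (1.0 - eps) ** length


if __name__ == "__main__":
    # Even a tiny per-token error rate compounds over long generations.
    for eps in (0.001, 0.01):
        for length in (100, 1000):
            print(f"eps={eps}, T={length}: "
                  f"P(no error) = {p_error_free(eps, length):.4f}")
```

This independence assumption understates the real effect, since in autoregressive decoding an early error also shifts the conditioning context for every later token; the sketch only illustrates why uncorrected errors accumulate rather than resolve.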