claim
Autoregressive generation in large language models creates pressure to favor fluent hallucination over honest uncertainty: at each step the model produces a probability distribution over tokens and must select one, and it lacks a mechanism to output 'I don't know'.
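A minimal sketch of the mechanism behind the claim, using only the Python standard library: a softmax turns raw scores into a token distribution, and a sampler must commit to exactly one token per step. The vocabulary, logits, and function names here are illustrative assumptions, not taken from the source.

```python
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution over tokens.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample(probs, rng):
    # The decoder must commit to exactly one token per step,
    # even when the distribution carries no clear preference.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Hypothetical next-token candidates for an uncertain factual question.
vocab = ["Paris", "London", "Berlin", "Rome"]
rng = random.Random(0)

# Near-uniform logits: the model is maximally uncertain, yet sampling
# still emits a fluent, confident-looking token. There is no
# "I don't know" outcome in the vocabulary.
uncertain_logits = [0.1, 0.0, 0.05, 0.02]
probs = softmax(uncertain_logits)
token = vocab[sample(probs, rng)]
print(token)       # some city name is always produced
print(max(probs))  # every option sits near probability 1/4
```

The point of the sketch: even when no option clears 30% probability, the sampling step forces a definite answer, which is why uncertainty surfaces as fabrication rather than abstention.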
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com, via serper)
Referenced by nodes (3)
- Large Language Models concept
- hallucination concept
- probability distribution concept