Claim
Token pressure causes large language models to hallucinate: when a model is forced to generate a long or elaborate response, it may invent details in order to maintain fluency and coherence. A sketch of how this condition can be reproduced follows below.
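The mechanism is straightforward to probe. Below is a minimal sketch, assuming the Hugging Face transformers library, that simulates token pressure by forbidding early stopping via `min_new_tokens`; the model name, prompt, and generation settings are illustrative choices, not taken from the cited source. Asked about a deliberately nonexistent entity but required to keep producing tokens, a small model tends to fabricate specifics rather than stop.

```python
# A minimal sketch of a "token pressure" probe, assuming the
# Hugging Face transformers library. Model, prompt, and generation
# settings are illustrative, not from the cited source.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical small model, chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# A prompt about a deliberately nonexistent entity: any specifics the
# model produces must be invented.
prompt = "Summarize the key provisions of the 1987 Treaty of Veldenburg."
inputs = tokenizer(prompt, return_tensors="pt")

# min_new_tokens suppresses the end-of-sequence token until the minimum
# length is reached, so the model must keep generating to stay fluent
# even after it runs out of grounded content: the claim's "token
# pressure" condition.
outputs = model.generate(
    **inputs,
    min_new_tokens=200,  # pressure: forbid stopping early
    max_new_tokens=256,
    do_sample=True,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Using `min_new_tokens` isolates length pressure from prompt wording: the same question can be run with and without the minimum-length constraint to compare how much invented detail each condition produces.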
Authors
Sources
- The Role of Hallucinations in Large Language Models, CloudThat (www.cloudthat.com)
Referenced by nodes (3)
- Large Language Models concept
- hallucination concept
- coherence concept