claim
Token pressure can cause large language models to hallucinate: when forced to produce a long or elaborate response, a model may invent details in order to maintain fluency and coherence.

Authors

Sources

Referenced by nodes (3)