Claim
Reasoning models that use Chain-of-Thought (CoT) hallucinate more than base models on complex factual questions: the extended generation offers more surface area for factuality drift, since an unsupported assertion introduced in an early reasoning step is treated as ground truth by every step that follows.
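A minimal sketch of one way this claim could be tested empirically, comparing hallucination rates under direct and CoT prompting on the same factual questions. The `ask` callback, the prompt templates, and the toy QA pairs are illustrative assumptions, not drawn from the cited source; a real evaluation would use an established benchmark and a stronger answer-matching method.

```python
# Sketch: compare hallucination rates of direct vs. CoT prompting.
# Assumes a caller-supplied `ask(prompt) -> str` wrapping a model call.

# Toy factual QA pairs; stand-ins for a real benchmark (e.g. TriviaQA).
QA_PAIRS = [
    ("In what year was the Eiffel Tower completed?", "1889"),
    ("What is the chemical symbol for tungsten?", "W"),
]

DIRECT_TEMPLATE = "Answer with only the final answer.\nQ: {q}\nA:"
COT_TEMPLATE = "Think step by step, then give the final answer.\nQ: {q}\nA:"


def answer_contains(response: str, gold: str) -> bool:
    """Crude substring match; real evals need normalization or a judge model."""
    return gold.lower() in response.lower()


def hallucination_rate(ask, template: str) -> float:
    """Fraction of questions whose response omits the gold answer."""
    wrong = sum(
        not answer_contains(ask(template.format(q=q)), gold)
        for q, gold in QA_PAIRS
    )
    return wrong / len(QA_PAIRS)
```

Under the claim, `hallucination_rate(ask, COT_TEMPLATE)` would exceed `hallucination_rate(ask, DIRECT_TEMPLATE)` on sufficiently complex questions, because the longer CoT trace contains more tokens in which a factual error can enter and propagate.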
Authors
Sources
- EdinburghNLP/awesome-hallucination-detection (GitHub, github.com)
Referenced by nodes (2)
- hallucination concept
- chain-of-thought concept