Claim
Large language models that are better at following instructions and producing fluent prose may hallucinate on tail entities at rates similar to simpler models, but their hallucinations are more convincing.
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com, via Serper)
Referenced by nodes (2)
- Large Language Models concept
- hallucination concept