Claim
Large language models may lack a reliable internal representation of their own knowledge cutoff, leading them to conflate the current date with the period they know most about and to treat outdated information as current.
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com)
Referenced by nodes (1)
- Large Language Models concept