Claim
When large language models are queried about tail entities, they face a difficult inference problem: lacking consistent exposure to those entities during training, they generalize from surface-level patterns to predict the plausible form of an answer rather than its actual content.
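This failure mode can be illustrated with a toy sketch (not the cited work's method, and `ToyLM` is a hypothetical name): a "model" that memorizes facts about frequently seen entities but, for unseen tail entities, falls back to emitting the most common surface-level *form* of answer it has observed for that attribute.

```python
from collections import Counter

# Toy illustration of the claim: memorized facts for head entities,
# form-level generalization (a hallucination) for tail entities.
class ToyLM:
    def __init__(self):
        self.facts = {}        # memorized (entity, attribute) -> value
        self.form_counts = {}  # attribute -> Counter of observed answer forms

    def train(self, entity, attribute, value):
        self.facts[(entity, attribute)] = value
        self.form_counts.setdefault(attribute, Counter())[self._form(value)] += 1

    @staticmethod
    def _form(value):
        # Crude surface-level type: a 4-digit year vs. anything else.
        return "YEAR" if value.isdigit() and len(value) == 4 else "NAME"

    def answer(self, entity, attribute):
        # Head entity: recall the memorized fact.
        if (entity, attribute) in self.facts:
            return self.facts[(entity, attribute)], "recalled"
        # Tail entity: no stored fact, so generalize from surface patterns --
        # emit a placeholder of the most frequent answer form for this attribute.
        form = self.form_counts[attribute].most_common(1)[0][0]
        return f"<plausible {form}>", "guessed form"

lm = ToyLM()
lm.train("Einstein", "born", "1879")
lm.train("Curie", "born", "1867")
print(lm.answer("Einstein", "born"))        # → ('1879', 'recalled')
print(lm.answer("Obscure Person", "born"))  # → ('<plausible YEAR>', 'guessed form')
```

The guessed answer looks type-appropriate (a year) while being untethered to any fact, which is the surface-pattern generalization the claim describes.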
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com, via serper)
Referenced by nodes (1)
- Large Language Models (concept)