claim
Large language models have structural knowledge gaps: they cannot know information that was absent from their training data, or that appeared too rarely for the model to form reliable internal representations of it.
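The claim can be illustrated with a deliberately tiny stand-in for a language model: a bigram counter trained on a two-sentence corpus (both the corpus and the `next_word` helper are hypothetical, for demonstration only). A word that never occurs in training has no entry in the counts at all, so no amount of clever decoding can recover it; the gap is structural, not a matter of confidence.

```python
from collections import Counter

# Hypothetical toy corpus standing in for "training data".
corpus = [
    "paris is the capital of france",
    "berlin is the capital of germany",
]

# "Training": count adjacent word pairs (bigrams).
counts = Counter()
for sentence in corpus:
    tokens = sentence.split()
    for a, b in zip(tokens, tokens[1:]):
        counts[(a, b)] += 1

def next_word(prompt_word):
    """Return the most frequent continuation of prompt_word,
    or None when the word never appeared in training --
    a structural knowledge gap: no representation exists."""
    candidates = {b: c for (a, b), c in counts.items() if a == prompt_word}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

print(next_word("paris"))  # covered in training: "is"
print(next_word("tokyo"))  # never seen: None
```

A real transformer differs enormously in scale and mechanism, but the limitation is analogous: parameters can only encode statistical structure that the training corpus actually contained, and sufficiently often to be learned.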
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com)
Referenced by nodes (1)
- Large Language Models concept