Claim
Retrieval-augmented generation reduces hallucination for tail entities by supplying factual grounding directly in the model's context window, letting the model rely on its in-context reasoning even when its parametric knowledge of the entity is weak.
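A minimal sketch of the mechanism the claim describes: retrieve passages relevant to the query and prepend them to the prompt, so facts about a rarely-seen entity come from the context rather than the model's weights. All names and the toy lexical retriever below are illustrative; real systems use dense retrievers and an LLM call.

```python
# Illustrative RAG prompt assembly (hypothetical, simplified).

def score(query: str, doc: str) -> int:
    """Crude lexical-overlap relevance score between query and document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest overlap score."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Ground the query in retrieved passages so the model need not
    rely on weak parametric knowledge of a tail entity."""
    context = "\n".join(f"- {p}" for p in retrieve(query, corpus))
    return (f"Context:\n{context}\n\n"
            f"Question: {query}\nAnswer using only the context.")

corpus = [
    "Aldabra giant tortoises are native to the Aldabra Atoll in Seychelles.",
    "The Eiffel Tower is located in Paris, France.",
    "Aldabra Atoll is a UNESCO World Heritage Site.",
]
prompt = build_prompt("Where are Aldabra giant tortoises native to?", corpus)
```

With a tail entity like the Aldabra giant tortoise, the retrieved passages land in the prompt, and the (hypothetical) downstream model can answer from the context instead of fabricating.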
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts mbrenndoerfer.com via serper
Referenced by nodes (2)
- Large Language Models concept
- Retrieval-Augmented Generation (RAG) concept