Claim
Retrieval-Augmented Generation (RAG) reduces hallucinations by grounding responses in external knowledge sources, though it can introduce new hallucinations of its own when retrieval quality is poor, the retrieved context overflows the model's window, or the reranker surfaces the wrong passages.
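A minimal sketch of the mechanism behind this claim, assuming a toy keyword-overlap retriever over an in-memory knowledge base; every name here (`retrieve`, `build_prompt`, `MAX_CONTEXT_TOKENS`) is hypothetical and not taken from the cited sources. It illustrates both halves of the claim: answers are constrained to retrieved passages (grounding), and a crude token budget guards against context overflow.

```python
# Hypothetical illustration, not an implementation from the cited sources.
MAX_CONTEXT_TOKENS = 512  # crude budget to avoid context overflow

KNOWLEDGE_BASE = [
    "RAG grounds answers in passages retrieved from an external knowledge base.",
    "Poor retrieval quality can surface irrelevant passages that mislead the model.",
    "A misaligned reranker can bury the relevant passage below irrelevant ones.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Score passages by naive keyword overlap and return the top-k."""
    terms = set(query.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda p: len(terms & set(p.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Pack passages until the token budget is hit, then ask for a grounded answer."""
    context, used = [], 0
    for p in passages:
        cost = len(p.split())  # whitespace count as a stand-in for real tokenization
        if used + cost > MAX_CONTEXT_TOKENS:
            break  # drop the remainder rather than overflow the context window
        context.append(p)
        used += cost
    return (
        "Answer ONLY from the passages below; say 'not found' otherwise.\n\n"
        + "\n".join(f"- {p}" for p in context)
        + f"\n\nQuestion: {query}"
    )

query = "How does RAG reduce hallucinations?"
print(build_prompt(query, retrieve(query)))
```

If `retrieve` ranks an irrelevant passage above the relevant one (the poor-retrieval and misaligned-reranking cases), the grounding instruction is still followed but against the wrong evidence, which is how RAG can trade one kind of hallucination for another.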
Authors
Sources
- LLM Hallucination Detection and Mitigation: State of the Art in 2026 (zylos.ai)
- Reducing hallucinations in large language models with custom ... (aws.amazon.com)
- The Role of Hallucinations in Large Language Models (cloudthat.com)
Referenced by nodes (3)
- hallucination concept
- Retrieval-Augmented Generation (RAG) concept
- external knowledge base concept