claim
Retrieval-Augmented Generation (RAG) systems are prone to hallucinations: generated content that is not grounded in the retrieved context or is factually incorrect.
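The claim can be illustrated with a minimal lexical groundedness check that flags answer sentences with little word overlap against the retrieved context. This is a hedged sketch: the function names, stopword list, and 0.5 threshold are illustrative assumptions, not the detection method described in the cited AWS source.

```python
# Minimal lexical groundedness check for RAG output (illustrative sketch).
# Flags answer sentences whose content-word overlap with the retrieved
# context falls below a threshold; real detectors use NLI or LLM judges.

STOPWORDS = {"the", "a", "an", "is", "are", "was", "were",
             "of", "to", "in", "and", "or", "it", "by"}

def _content_words(text: str) -> set[str]:
    """Lowercased words with punctuation stripped, minus stopwords."""
    return {w.strip(".,;:!?").lower() for w in text.split()} - STOPWORDS

def grounded_score(sentence: str, context: str) -> float:
    """Fraction of the sentence's content words found in the context."""
    words = _content_words(sentence)
    if not words:
        return 1.0  # nothing to check
    return len(words & _content_words(context)) / len(words)

def flag_hallucinations(answer: str, context: str,
                        threshold: float = 0.5) -> list[str]:
    """Return answer sentences that are weakly supported by the context."""
    sentences = [s.strip() for s in answer.split(".") if s.strip()]
    return [s for s in sentences if grounded_score(s, context) < threshold]

context = "The Eiffel Tower is in Paris and was completed in 1889."
answer = "The Eiffel Tower is in Paris. It was designed by Leonardo da Vinci."
print(flag_hallucinations(answer, context))
# The second sentence has no support in the context, so it is flagged.
```

Lexical overlap is a crude proxy; it misses paraphrased hallucinations and penalizes valid paraphrases, which is why production systems favor entailment-based or model-graded checks.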
Authors
Sources
- Detect hallucinations for RAG-based systems - AWS (aws.amazon.com, via serper)
Referenced by nodes (2)
- hallucination concept
- Retrieval-Augmented Generation (RAG) concept