Procedure
For Questions 2 and 3 in the evaluation, a hallucination is detected if the LLM (a) reports an abnormal event at an incorrect time, (b) produces an answer not present in the provided data, or (c), in the KG-IRAG system specifically, the second LLM (LLM2) fails to decide when to stop the iterative exploration process.
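The three conditions above can be sketched as a simple checklist. This is an illustrative assumption, not the authors' actual evaluation code: the function name, argument shapes, and the exact-string grounding check are all hypothetical simplifications.

```python
def is_hallucination(answer_event_time, true_event_time,
                     answer_text, provided_facts, llm2_stopped=True):
    """Hypothetical sketch of the three hallucination checks.

    provided_facts: set of ground-truth statements from the retrieved data.
    llm2_stopped: whether LLM2 ever issued a stop decision (KG-IRAG only).
    """
    # (a) An abnormal event is reported, but at the wrong time.
    if answer_event_time is not None and answer_event_time != true_event_time:
        return True
    # (b) The answer is not grounded in the provided data
    # (simplified here to an exact-match membership test).
    if answer_text not in provided_facts:
        return True
    # (c) KG-IRAG only: LLM2 never decided to stop exploring.
    if not llm2_stopped:
        return True
    return False
```

A real evaluation would replace the exact-match test in (b) with a softer grounding comparison (e.g., entailment or normalized matching), but the decision structure is the same.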
Authors
Sources
- KG-IRAG: A Knowledge Graph-Based Iterative Retrieval-Augmented ... arxiv.org via serper
Referenced by nodes (1)
- KG-IRAG concept