Relations (1)

related 2.58 — strongly supporting 5 facts

Hallucination is defined in the KG-RAG framework as the presence of information not found in the ground truth [1], [2], and is measured by comparing predicted answers against the ground truth to identify token mismatches [3], [4]. Furthermore, research indicates that the semantic proximity of a response to the ground truth influences the detectability of hallucinations [5].

Facts (5)

Sources
KG-RAG: Bridging the Gap Between Knowledge and Creativity - arXiv arxiv.org arXiv 4 facts
claim: A hallucination score of '1' in the KG-RAG evaluation framework indicates a hallucinated response, determined by the absence of perfect precision (a token mismatch between the predicted and ground-truth answers) and the presence of specific heuristic indicators, such as phrases like 'I don't know'.
formula: Hallucination in the KG-RAG evaluation framework is defined as responses containing information not present in the ground truth, and it is calculated as: Hallucination Rate = (1/N) * Σ(1 if the predicted answer lacks perfect precision AND contains heuristic indicators of uncertainty, else 0), where N is the number of samples.
[2502.14302] MedHallu: A Comprehensive Benchmark for Detecting ... arxiv.org arXiv 1 fact
claim: Using bidirectional entailment clustering, the authors of the MedHallu paper demonstrated that harder-to-detect hallucinations are semantically closer to the ground truth.
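The hallucination-rate formula above can be sketched in code. This is a minimal illustration, not the paper's implementation: the token-level precision check and the list of uncertainty phrases are assumptions chosen for clarity.

```python
# Sketch of the KG-RAG hallucination-rate heuristic:
# a sample counts as hallucinated (score 1) when the predicted answer
# lacks perfect precision against the ground truth AND contains a
# heuristic uncertainty indicator such as "i don't know".

# Hypothetical uncertainty phrases; the paper's exact list may differ.
UNCERTAINTY_PHRASES = ("i don't know", "i am not sure", "cannot answer")

def is_hallucinated(predicted: str, ground_truth: str) -> bool:
    # Perfect precision (assumed here as): every predicted token
    # also appears in the ground-truth answer.
    pred_tokens = predicted.lower().split()
    gt_tokens = set(ground_truth.lower().split())
    perfect_precision = all(t in gt_tokens for t in pred_tokens)
    has_uncertainty = any(p in predicted.lower() for p in UNCERTAINTY_PHRASES)
    return (not perfect_precision) and has_uncertainty

def hallucination_rate(predictions: list[str], ground_truths: list[str]) -> float:
    # Hallucination Rate = (1/N) * sum of per-sample 0/1 scores.
    n = len(predictions)
    return sum(is_hallucinated(p, g)
               for p, g in zip(predictions, ground_truths)) / n

preds = ["paris", "i don't know the answer"]
gts = ["paris", "london"]
print(hallucination_rate(preds, gts))  # 0.5: only the second sample scores 1
```

The first prediction matches the ground truth exactly (perfect precision, score 0); the second both mismatches and signals uncertainty, so it scores 1, giving a rate of 0.5.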