claim
Hallucination detection involves checking the factuality of LLM-generated responses against a set of references. This requires answering three questions: how and where to find the references, at what granularity to check the response, and how to categorize the claims it contains.
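The three questions above can be made concrete with a minimal sketch of a claim-level checking pipeline. Everything here is an illustrative assumption: the sentence-level granularity, the naive substring match standing in for a real verifier, and the function names (`split_into_claims`, `verify_claim`, `detect_hallucinations`) are not an established API.

```python
def split_into_claims(response: str) -> list[str]:
    """Granularity choice (question 2): check the response one sentence at a time."""
    return [s.strip() for s in response.split(".") if s.strip()]

def verify_claim(claim: str, references: list[str]) -> str:
    """Categorization choice (question 3): label each claim against the references.

    A real system would use retrieval plus entailment (supported /
    contradicted / unverifiable); substring matching is a stand-in.
    """
    for ref in references:
        if claim.lower() in ref.lower():
            return "supported"
    return "unverified"  # no reference mentions this claim

def detect_hallucinations(response: str, references: list[str]) -> dict[str, str]:
    """Check every claim in the response against the reference set (question 1
    — here the references are simply given; finding them is the hard part)."""
    return {c: verify_claim(c, references) for c in split_into_claims(response)}
```

For example, with `references = ["The Eiffel Tower is in Paris, France."]`, the response "The Eiffel Tower is in Paris. It was built in 1900." yields one supported claim and one unverified claim.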