Claim
The Survey of Hallucination in Natural Language Generation defines extrinsic hallucination as generated output that cannot be verified against the source content, and intrinsic hallucination as generated output that contradicts the source content.
Authors
Sources
- EdinburghNLP/awesome-hallucination-detection (GitHub, github.com)
Referenced by nodes (2)
- extrinsic hallucination concept
- natural language generation concept