reference
Ji et al. (2024) published "LLM Internal States Reveal Hallucination Risk Faced With a Query" on arXiv.
Authors
Sources
- Awesome-Hallucination-Detection-and-Mitigation (GitHub, github.com)
Referenced by nodes (1)
- hallucination detection concept