reference
Azaria and Mitchell (2023) published 'The Internal State of an LLM Knows When It's Lying' in EMNLP Findings, exploring the use of a model's internal states to detect falsehoods.
Authors
- Amos Azaria
- Tom Mitchell
Sources
- Awesome-Hallucination-Detection-and-Mitigation (GitHub, github.com)
Referenced by nodes (1)
- hallucination detection concept