claim
Introducing a "not sure" category in large language model (LLM) hallucination detection improves precision: the detector commits to a hallucinated/not-hallucinated call only when its confidence is high and abstains otherwise, so fewer error-prone borderline cases enter the set of positive predictions.
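A minimal sketch of this abstention mechanism, under assumed inputs (the function name, threshold, and toy data below are illustrative, not from the MedHallu source): the detector labels an example "hallucination" only when its predicted probability clears a confidence threshold, counts anything in the uncertain middle band as "not sure", and precision is computed over the committed positive calls only.

```python
def precision_with_abstention(preds, threshold):
    """preds: list of (p, y) pairs, where p is the model's predicted
    probability that the example is a hallucination and y is the true
    label (1 = hallucination, 0 = not).

    The model commits only when p >= threshold (call: hallucination)
    or p <= 1 - threshold (call: not hallucination); everything in
    between is "not sure" and excluded. Returns precision over the
    committed positive calls, or None if it abstained on all of them.
    """
    called_positive = [(p, y) for p, y in preds if p >= threshold]
    if not called_positive:
        return None
    true_positives = sum(y for _, y in called_positive)
    return true_positives / len(called_positive)


# Toy scores: borderline cases (p near 0.5) are the ones most often wrong.
preds = [
    (0.95, 1), (0.90, 1), (0.85, 1), (0.97, 1),  # confident, correct
    (0.60, 0), (0.55, 0), (0.52, 1), (0.58, 0),  # borderline, noisy
]

# threshold 0.5 => no abstention: every example gets a positive call.
print(precision_with_abstention(preds, 0.5))  # 5/8 = 0.625

# threshold 0.8 => borderline cases become "not sure" and are dropped,
# so only the four confident (and correct) calls remain.
print(precision_with_abstention(preds, 0.8))  # 4/4 = 1.0
```

The trade-off is coverage: raising the threshold here improves precision from 0.625 to 1.0, but the detector now decides on only half of the examples.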
Authors
Sources
- MedHallu: Benchmark for Medical LLM Hallucination Detection www.emergentmind.com via serper
Referenced by nodes (1)
- hallucination detection concept