procedure
Confident decoding mitigates LLM hallucinations by constraining the decoding process to avoid sampling low-probability tokens, since low-confidence outputs are more likely to be hallucinated.
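The idea above can be sketched as a confidence-thresholded sampling step: tokens whose probability falls below a cutoff are filtered out before the next token is chosen. This is a minimal illustrative sketch, not the method from the cited source; the vocabulary, logits, and threshold value are all hypothetical.

```python
import math

def softmax(logits):
    # Numerically stable softmax over raw model scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def confident_sample(logits, vocab, threshold=0.1):
    # Keep only tokens whose probability meets the confidence
    # threshold, renormalize, then pick greedily from that set.
    probs = softmax(logits)
    kept = [(p, tok) for p, tok in zip(probs, vocab) if p >= threshold]
    if not kept:
        # Fall back to the single most probable token if nothing passes.
        kept = [max(zip(probs, vocab))]
    mass = sum(p for p, _ in kept)
    kept = [(p / mass, tok) for p, tok in kept]
    return max(kept)[0:2][1]

# Hypothetical next-token distribution for "The capital of France is ...":
vocab = ["Paris", "London", "Rome", "Atlantis"]
logits = [3.0, 1.5, 1.0, 0.2]
print(confident_sample(logits, vocab))  # -> Paris
```

In this sketch the low-probability candidate "Atlantis" is removed before sampling, which is the core intuition: restricting generation to tokens the model is confident about reduces the chance of emitting fabricated content.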
Authors
Sources
- Hallucinations in LLMs: Can You Even Measure the Problem? www.linkedin.com via serper
Referenced by nodes (1)
- LLM hallucinations in medicine concept