procedure
Self-refinement mitigates LLM hallucinations by having the model critique and revise its own draft output before presenting the final response.
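The review-and-adjust loop can be sketched as below. This is a minimal illustration, not a specific published method: `call_model` is a hypothetical stand-in for any LLM API, stubbed with canned responses so the control flow runs on its own; the prompts, the `NO ISSUES` sentinel, and `max_rounds` are all assumptions for the example.

```python
def call_model(prompt: str) -> str:
    # Hypothetical LLM call, stubbed with canned responses for illustration.
    if prompt.startswith("Critique"):
        # Pretend the critic only approves answers it has already revised.
        return "NO ISSUES" if "(revised)" in prompt else "Unsupported claim found."
    if prompt.startswith("Revise"):
        return "Paris is the capital of France. (revised)"
    return "Paris is the capital of Italy."  # initial draft with a hallucination

def self_refine(question: str, max_rounds: int = 3) -> str:
    """Generate a draft, then loop: critique it, and revise until no issues remain."""
    draft = call_model(question)
    for _ in range(max_rounds):
        critique = call_model(f"Critique this answer for factual errors: {draft}")
        if critique.strip() == "NO ISSUES":
            break  # the model found nothing to fix; accept the current draft
        draft = call_model(f"Revise the answer to address: {critique}\nAnswer: {draft}")
    return draft

print(self_refine("What is the capital of France?"))
```

The key design point is that the same model plays both roles (drafter and critic), and the bounded `max_rounds` loop prevents endless self-revision.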
Authors
Sources
- Hallucinations in LLMs: Can You Even Measure the Problem? www.linkedin.com via serper
Referenced by nodes (1)
- LLM hallucinations in medicine concept