Reference
Tonmoy, S. M. T. I., et al. "A Comprehensive Survey of Hallucination Mitigation Techniques in Large Language Models." 2024 (arXiv:2401.01313).
Sources
- A framework to assess clinical safety and hallucination rates of LLMs ... (www.nature.com)
Referenced by nodes (2)
- Large Language Models concept
- hallucination mitigation concept