Reference
Huang, L., et al. "A Survey on Hallucination in Large Language Models: Principles, Taxonomy, Challenges, and Open Questions." arXiv:2311.05232, 2023.
Sources
- A framework to assess clinical safety and hallucination rates of LLMs ... www.nature.com via serper
- Survey and analysis of hallucinations in large language models www.frontiersin.org via serper
Referenced by nodes (2)
- Large Language Models concept
- hallucination concept