claim
Prompting-induced hallucinations arise when prompts are vague, underspecified, or structurally misleading, pushing the model toward speculative generation, as noted by Reynolds and McDonell (2021), Wei et al. (2022), and Zhou et al. (2022).
Authors
Sources
- Survey and analysis of hallucinations in large language models (www.frontiersin.org)
Referenced by nodes (1)
- prompt-induced hallucination concept