Claim
A hallucination occurs when a GPT-style LLM generates a response that appears plausible but is nonfactual, nonsensical, or inconsistent with the provided input.
Authors
Sources
- GPTs and Hallucination, Communications of the ACM (cacm.acm.org)
Referenced by nodes (1)
- hallucination concept