claim
In the context of artificial intelligence, hallucination refers to a large language model generating output that is fluent and confidently stated but factually incorrect, fabricated, or unverifiable.
Authors
Sources
- The Role of Hallucinations in Large Language Models - CloudThat (www.cloudthat.com, via serper)
Referenced by nodes (2)
- hallucination concept
- artificial intelligence concept