procedure
Human oversight mitigates large language model hallucinations by combining fact-checking processes with human evaluators who review and correct model outputs before they are used; a common pattern routes low-confidence outputs to a reviewer, as sketched below.
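A minimal sketch of such a human-in-the-loop review gate, in Python. All names here (Draft, review_gate, the 0.8 threshold) are illustrative assumptions, not terms from the source; real systems vary in how confidence is estimated and how reviews are queued.

```python
# Minimal human-in-the-loop review gate (illustrative sketch).
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Draft:
    """An LLM answer plus a self-reported confidence score in [0, 1]."""
    text: str
    confidence: float

def review_gate(
    draft: Draft,
    human_review: Callable[[str], Optional[str]],
    threshold: float = 0.8,  # assumed cutoff; tune per application
) -> str:
    """Return the draft directly if confidence clears the threshold;
    otherwise route it to a human evaluator for fact-checking.
    The reviewer may correct the text or return None to reject it."""
    if draft.confidence >= threshold:
        return draft.text
    corrected = human_review(draft.text)
    if corrected is None:
        raise ValueError("Draft rejected by human reviewer")
    return corrected

# Usage example with a stand-in reviewer that approves everything.
if __name__ == "__main__":
    draft = Draft(text="The Eiffel Tower is 330 m tall.", confidence=0.6)
    print(review_gate(draft, human_review=lambda text: text))
```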
Sources
- LLM Hallucinations: Causes, Consequences, Prevention (llmmodels.org, via serper)
Referenced by nodes (2)
- large language model hallucination concept
- fact-checking concept