Procedure
To address AI hallucinations, 85% of survey respondents (51) cross-reference outputs with external sources, while others consult colleagues or experts (12), ignore erroneous outputs (11), stop using the AI/LLM (11), inform the model of its mistake (1), update the prompt (1), rely on answers they already know to be correct (1), or examine the underlying code (1).
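As a quick sanity check on these figures, the sketch below tallies the reported counts; the 85% figure implies roughly 60 respondents, and since the per-strategy counts sum past that total, the question was presumably multi-select (that interpretation is an assumption, not stated in the source).

```python
# Reported counts for each hallucination-mitigation strategy (from the summary above).
counts = {
    "cross-reference with external sources": 51,
    "consult colleagues or experts": 12,
    "ignore erroneous outputs": 11,
    "stop using the AI/LLM": 11,
    "inform the model of its mistake": 1,
    "update the prompt": 1,
    "rely on known correct answers": 1,
    "examine underlying code": 1,
}

# 51 respondents correspond to 85%, implying about 60 respondents overall.
total_respondents = round(51 / 0.85)  # -> 60

# Counts sum to 89 > 60, suggesting respondents could report multiple strategies.
assert sum(counts.values()) == 89
print(total_respondents, sum(counts.values()))
```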
Sources
- Medical Hallucination in Foundation Models and Their ... (www.medrxiv.org)
Referenced by nodes (2)
- AI hallucinations concept
- artificial intelligence concept