claim
Medical annotation for evaluating AI hallucinations is often constrained by the limited time clinicians have to assess each AI-generated output, making it difficult to distinguish clear hallucinations from potentially useful but unconventional diagnoses.
Authors
Sources
- Medical Hallucination in Foundation Models and Their Impact on ... www.medrxiv.org via serper
Referenced by nodes (1)
- AI hallucinations concept