Relations (1)


Foundation models are increasingly integrated into health care for clinical decision support, research, and system operations, as described in [1] and [2]. However, applying these models in the medical field introduces specific challenges, such as medical hallucinations, which are defined and evaluated within the health care domain in [3], [4], and [5].

Facts (5)

Sources
Medical Hallucination in Foundation Models and Their ... (medRxiv, medrxiv.org) — 3 facts
- The authors define medical hallucination in foundation models as a concept distinct from general hallucinations, characterized by unique risks within the health care domain.
- The causes of medical hallucinations in foundation models are driven by data quality, model limitations, and the complexities of the health care domain.
- Foundation models, including large language models (LLMs) and large vision-language models (VLMs), are used in health care for clinical decision support, medical research, and improving health care quality and safety.
Medical Hallucination in Foundation Models and Their Impact on ... (medRxiv, medrxiv.org) — 2 facts
- Foundation models are increasingly used in health care for clinical decision support, medical research, and health-system operations.
- The study evaluated a diverse set of foundation models, including both general-purpose models and medical-purpose models designed or fine-tuned for health care applications, to assess medical hallucinations.