Relations (1)
related — strongly supporting, backed by 13 facts (justification not yet generated; the supporting facts are listed below):
- Medical hallucination is defined as any instance in which a foundation model generates misleading medical content.
- The authors of the study define medical hallucination as a reasoning-driven failure mode of foundation models that is distinct from general hallucinations in both its origin and clinical consequence.
- The authors of 'Medical Hallucination in Foundation Models and Their ...' contributed a taxonomy for understanding and addressing medical hallucinations, benchmarked models using a medical hallucination dataset and physician-annotated LLM responses to real medical cases, and conducted a multi-national clinician survey on experiences with medical hallucinations.
- The authors of the study 'Medical Hallucination in Foundation Models and Their Impact on ...' define medical hallucination as any model-generated output that is factually incorrect, logically inconsistent, or unsupported by authoritative clinical evidence in ways that could alter clinical decisions.
- The study's empirical evaluation, utilizing a physician-audited benchmark, indicates that most medical hallucinations in foundation models stem from failures in causal and temporal reasoning rather than missing medical knowledge.
- Structured prompting and retrieval-augmented generation can reduce medical hallucinations in foundation models by over 10%, according to the study's empirical evaluation (see the retrieval-augmented prompting sketch after this list).
- The authors define medical hallucination in foundation models as a concept distinct from general hallucinations, characterized by risks unique to the healthcare domain.
- In an evaluation of 11 foundation models (7 general-purpose, 4 medical-specialized) across seven medical hallucination tasks, general-purpose models achieved a median of 76.6% hallucination-free responses, while medical-specialized models achieved a median of 51.3%.
- Medical hallucinations in foundation models manifest as misordered symptom progression, flawed diagnostic logic, or misplaced causal inference, and these errors persist even in large-scale models.
- Medical hallucinations in foundation models are driven by data-quality issues, model limitations, and the complexity of the healthcare domain.
- Medical hallucinations in foundation models are categorized into a taxonomy ranging from factual inaccuracies to complex reasoning errors.
- Detection and mitigation strategies for medical hallucinations in foundation models include factual verification, consistency checks, uncertainty quantification, and prompt engineering (see the self-consistency sketch after this list).
- The taxonomy of medical hallucinations in foundation models clusters errors into five main categories: factual errors, outdated references, spurious correlations, incomplete chains of reasoning, and fabricated sources or guidelines (rendered as a small labeling schema below).
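Since several of the facts above reference this five-way taxonomy, it can help to see it as an annotation schema. The sketch below is a minimal rendering in Python: the category names follow the fact above, the one-line descriptions are paraphrases, and the labeled example response is invented purely for illustration (it is not from the study's benchmark).

```python
from enum import Enum

class MedicalHallucination(Enum):
    """Five top-level categories of the taxonomy (descriptions paraphrased)."""
    FACTUAL_ERROR = "factually incorrect medical statement"
    OUTDATED_REFERENCE = "relies on superseded guidance or literature"
    SPURIOUS_CORRELATION = "treats a correlation as clinical causation"
    INCOMPLETE_REASONING = "truncated or skipped chain of clinical reasoning"
    FABRICATED_SOURCE = "invented citation, source, or guideline"

# Hypothetical physician-style annotation of a single model response:
example = {
    "response": "A 2030 AHA guideline recommends drug X for all hypertensive patients.",
    "labels": [MedicalHallucination.FABRICATED_SOURCE],
}
print(example["labels"][0].value)  # -> "invented citation, source, or guideline"
```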
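The >10% mitigation figure above pairs structured prompting with retrieval-augmented generation. Below is a minimal, hypothetical sketch of that pattern, assuming an abstract `generate` callable in place of a real model API: rank snippets from a small in-memory evidence store (a stand-in for a real BM25 or dense retriever over clinical guidelines), then constrain the model to answer only from the retrieved evidence or abstain. None of this is the study's implementation; it only illustrates the technique the fact names.

```python
from typing import Callable, List

# Hypothetical in-memory evidence store; a real system would index a
# corpus of clinical guidelines or literature.
EVIDENCE: List[str] = [
    "Metformin is a first-line pharmacologic therapy for type 2 diabetes.",
    "ACE inhibitors can cause a persistent dry cough in some patients.",
    "Warfarin dosing is monitored via the INR.",
]

def retrieve(question: str, store: List[str], k: int = 2) -> List[str]:
    """Rank snippets by naive word overlap with the question (a stand-in
    for a real BM25 or dense retriever) and return the top k."""
    q_words = set(question.lower().split())
    ranked = sorted(store,
                    key=lambda s: len(q_words & set(s.lower().split())),
                    reverse=True)
    return ranked[:k]

def structured_rag_prompt(question: str, snippets: List[str]) -> str:
    """Build a structured prompt that binds the answer to the evidence."""
    evidence = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return (
        "You are a careful clinical assistant.\n"
        f"Evidence:\n{evidence}\n\n"
        f"Question: {question}\n"
        "Answer using ONLY the evidence above and cite snippet numbers. "
        "If the evidence is insufficient, reply exactly: INSUFFICIENT EVIDENCE."
    )

def answer(question: str, generate: Callable[[str], str]) -> str:
    """Retrieve, build the structured prompt, and call the model."""
    return generate(structured_rag_prompt(question, retrieve(question, EVIDENCE)))

if __name__ == "__main__":
    # Echoing stub in place of a real LLM call, so the sketch runs as-is.
    print(answer("What drug is first-line for type 2 diabetes?", lambda p: p))
```

The explicit abstention instruction is the "structured" part: forcing cited, evidence-bound answers or an INSUFFICIENT EVIDENCE response is what distinguishes this from plain retrieval.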
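On the detection side, consistency checks and uncertainty quantification are commonly combined by sampling several answers to the same question and scoring their agreement, in the spirit of self-consistency methods such as SelfCheckGPT. The study names the strategy families; the particular scoring below (word-level Jaccard agreement across samples, with an uncalibrated threshold) is an assumption for illustration.

```python
import itertools
import random
from typing import Callable, List

def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two answers."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 1.0

def self_consistency_score(question: str,
                           generate: Callable[[str], str],
                           n_samples: int = 5) -> float:
    """Sample n answers (generate should be stochastic, e.g. temperature > 0)
    and return their mean pairwise agreement."""
    answers: List[str] = [generate(question) for _ in range(n_samples)]
    pairs = list(itertools.combinations(answers, 2))
    if not pairs:  # fewer than two samples: nothing to compare
        return 1.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

def flag_hallucination(question: str,
                       generate: Callable[[str], str],
                       threshold: float = 0.5) -> bool:
    """Uncertainty-quantification gate: flag when sampled answers disagree."""
    return self_consistency_score(question, generate) < threshold

if __name__ == "__main__":
    # Stub "model" that answers inconsistently; low agreement may trigger the flag.
    stub = lambda q: random.choice(["metformin is first line",
                                    "insulin is first line for all patients"])
    print(flag_hallucination("First-line therapy for type 2 diabetes?", stub))
```

In practice the threshold would be calibrated against labeled data, for example a physician-audited benchmark like the one the study describes.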
Facts (13)
Sources
Medical Hallucination in Foundation Models and Their ... (medrxiv.org) — 7 facts (listed above)
Medical Hallucination in Foundation Models and Their Impact on ... (medrxiv.org) — 6 facts (listed above)