claim
Medical hallucinations are defined as outputs from AI/LLM tools that are factually incorrect yet plausible and medically relevant.

Authors

Sources
