Claim
Existing hallucination detection methods that rely on general-purpose LLMs accessed through APIs such as the GPT API lack medical domain knowledge, evaluate text alone, and cannot incorporate image inputs.

Authors

Sources

Referenced by nodes (2)