Claim
When specific pathologic scenarios are underrepresented in training data, generative AI models processing out-of-distribution samples can synthesize artifacts that do not correspond to any actual medical condition.
Authors
Sources
- On Hallucinations in Artificial Intelligence–Generated Content ... jnm.snmjournals.org