Claim
Overrepresentation of specific patterns in training data, such as lesions frequently occurring in the liver, can cause generative AI models to hallucinate those features in test samples where they do not exist (a toy sketch of this effect appears after this card).
Authors
Sources
- On Hallucinations in Artificial Intelligence–Generated Content ... jnm.snmjournals.org via serper
Referenced by nodes (2)
- hallucination concept
- generative artificial intelligence concept
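To make the mechanism concrete, here is a minimal numerical sketch, assuming a toy per-region Bernoulli prior and a binary noise channel with flip probability eps. The region names, lesion rates, and per-region MAP decision rule are illustrative assumptions, not the model from the cited article.

```python
# Toy sketch (illustrative, not from the cited source): a per-region MAP
# "reconstruction" whose learned prior overrepresents liver lesions and
# therefore hallucinates one in a lesion-free scan. Region names, lesion
# rates, and the noise level eps are all assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
regions = ["liver", "kidney", "lung"]

# Biased training set: 90% of training scans show a liver lesion,
# versus 5% for the other regions.
train = rng.random((1000, 3)) < np.array([0.90, 0.05, 0.05])

# "Generative model": independent per-region Bernoulli prior fit to the data.
prior = train.mean(axis=0)  # roughly [0.90, 0.05, 0.05]

# Test case: a lesion-free scan. The estimator assumes each region's reading
# is flipped with probability eps (a stand-in for imaging noise), and here
# the observation happens to read lesion-free everywhere.
truth = np.zeros(3, dtype=bool)
observed = np.zeros(3, dtype=bool)
eps = 0.3

def map_estimate(p_lesion: float, y: bool, eps: float) -> bool:
    """Per-region MAP decision: argmax over x in {0, 1} of p(x) * p(y | x)."""
    like_1 = (1 - eps) if y else eps        # p(y | lesion present)
    like_0 = eps if y else (1 - eps)        # p(y | lesion absent)
    return p_lesion * like_1 > (1 - p_lesion) * like_0

recon = [map_estimate(prior[i], bool(observed[i]), eps) for i in range(3)]

for name, t, r in zip(regions, truth, recon):
    flag = "  <-- hallucinated" if r and not t else ""
    print(f"{name:6s} truth={bool(t)!s:5s} recon={r!s:5s}{flag}")
# liver  truth=False recon=True   <-- hallucinated
# kidney truth=False recon=False
# lung   truth=False recon=False
```

The point of the sketch: with a 0.90 learned prior, even an unambiguous lesion-free reading is overruled (0.90 * 0.30 > 0.10 * 0.70), so the reconstruction inserts a liver lesion, which is exactly the prior-dominance failure the claim describes; the same prior correctly suppresses spurious findings in the underrepresented regions.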