claim
Overrepresentation of specific patterns in the training data, such as lesions frequently occurring in the liver, can cause generative AI models to hallucinate those features in test samples where they do not exist.
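A toy sketch of the mechanism, assuming a trivial "generative model" that merely samples features from the empirical training distribution (dataset, labels, and rates are hypothetical, chosen only to illustrate the skew):

```python
import random

# Hypothetical training set: liver lesions heavily overrepresented.
training_labels = ["liver_lesion"] * 90 + ["kidney_lesion"] * 10

def toy_generate(n, rng):
    # Trivial generative model: draw features i.i.d. from the empirical
    # training distribution. Any model fit to this data inherits the skew.
    return [rng.choice(training_labels) for _ in range(n)]

rng = random.Random(0)
samples = toy_generate(1000, rng)
liver_rate = samples.count("liver_lesion") / len(samples)

# The generated cohort reproduces the training imbalance (~90% liver
# lesions) regardless of the true test population -- the statistical
# root of this kind of hallucination.
print(f"liver lesion rate in generated samples: {liver_rate:.2f}")
```

A real diffusion or GAN model is far more complex, but the failure mode is the same: the sampler reflects the training distribution, so overrepresented features appear in outputs even when the conditioning input does not contain them.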
