Claim
Grounded pretraining reduces hallucination in large language model generation, though it requires substantial data and compute resources.

Authors

Sources

Referenced by nodes (2)