claim
Grounded pretraining and fine-tuning improve factual consistency by integrating external knowledge sources or fact-labeled datasets during the pretraining or fine-tuning stages, as noted by Zhang et al. (2023).
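A minimal sketch of what "grounding" a fine-tuning example can look like in practice: retrieved evidence is prepended to each prompt so the model is trained to answer conditioned on a knowledge source. All names and the toy fact store below are illustrative assumptions, not from the source.

```python
# Toy knowledge store standing in for an external knowledge source
# (assumption: any retrieval backend could be substituted here).
KNOWLEDGE_BASE = {
    "boiling point of water": "Water boils at 100 \u00b0C at sea level.",
}

def retrieve(query: str) -> str:
    """Toy retriever: exact-match lookup in the small fact store."""
    return KNOWLEDGE_BASE.get(query.lower(), "")

def make_grounded_example(question: str, answer: str) -> dict:
    """Build one fine-tuning example with evidence prepended to the prompt."""
    evidence = retrieve(question)
    prompt = f"Evidence: {evidence}\nQuestion: {question}\nAnswer:"
    return {"prompt": prompt, "completion": " " + answer}

example = make_grounded_example("Boiling point of water", "100 \u00b0C")
```

The design choice being illustrated is that the evidence travels inside the training example itself, so the loss rewards answers consistent with the retrieved facts rather than the model's parametric memory alone.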
Authors
Sources
- Survey and analysis of hallucinations in large language models (www.frontiersin.org)
Referenced by nodes (2)
- fine-tuning concept
- factual consistency evaluation concept