claim
Pre-training contributes to LLM hallucinations because the density-estimation objective (maximizing the likelihood of the next token) forces the model to output a confident distribution even over information it has not learned.
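The mechanism behind this claim can be illustrated with a minimal sketch (my own illustration, not from the source): a language model's output layer is a softmax, which always produces a fully normalized next-token distribution. There is no "abstain" outcome, so sampling must commit to some concrete token even when the logits carry no learned signal.

```python
import math

def softmax(logits):
    # Exponentiate and normalize: the output always sums to 1.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Uninformative logits (all zeros, i.e. the model has "learned nothing"
# about this context) still yield a valid probability distribution,
# not an expression of uncertainty or refusal.
uninformed = softmax([0.0, 0.0, 0.0, 0.0])

# Every token gets equal, nonzero mass; sampling from this
# distribution must still pick some token as if it were an answer.
print(uninformed)
```

In this sense the training objective itself rules out "I don't know" as an output: probability mass must go somewhere, and decoding turns that mass into a concrete, confident-looking completion.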

Authors

Sources

Referenced by nodes (2)