claim
Ongoing research on mitigating LLM hallucinations includes techniques such as contrastive learning, knowledge grounding, consistency modeling, and uncertainty estimation.
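As an illustration of the last two techniques, a minimal sketch of consistency-based uncertainty estimation: sample several answers to the same prompt, then score uncertainty as the entropy of the empirical distribution over distinct answers. High entropy (low agreement) flags a possible hallucination. The function name and the sampled strings below are hypothetical, not from the source.

```python
import math
from collections import Counter

def answer_entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution over
    distinct sampled answers. 0.0 means all samples agree; larger
    values signal inconsistency and hence higher uncertainty."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# All samples agree -> zero uncertainty.
consistent = ["Paris", "Paris", "Paris", "Paris"]
# Samples disagree -> high uncertainty; the answer should be flagged.
inconsistent = ["Paris", "Lyon", "Marseille", "Paris"]

print(answer_entropy(consistent))    # 0.0
print(answer_entropy(inconsistent))  # 1.5
```

Real systems refine this by clustering semantically equivalent answers before computing the entropy, but the sampling-and-agreement principle is the same.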
