Relations (1)

related (score 1.58) — strongly supporting, 2 facts

Knowledge graphs are related to pre-training because they are integrated into Large Language Models during the pre-training phase to improve knowledge expression, as described in [1] and [2].
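One common way such integration is done is to verbalize KG triples into natural-language sentences and mix them into the pre-training corpus. The sketch below illustrates that idea only; the triples, template, and function names are illustrative assumptions, not taken from [1] or [2].

```python
# Minimal sketch: turn KG triples into text an LLM could pre-train on.
# The relation names and template are hypothetical examples.

def verbalize_triple(head: str, relation: str, tail: str) -> str:
    """Render a (head, relation, tail) triple as a plain sentence."""
    return f"{head} {relation.replace('_', ' ')} {tail}."

def build_kg_corpus(triples) -> str:
    """Concatenate verbalized triples into a pre-training text chunk."""
    return "\n".join(verbalize_triple(*t) for t in triples)

triples = [
    ("Marie Curie", "was_born_in", "Warsaw"),
    ("Warsaw", "is_the_capital_of", "Poland"),
]

print(build_kg_corpus(triples))
```

Real KG-enhanced pre-training pipelines go further (entity linking, knowledge-aware objectives), but the verbalization step above conveys how graph facts become training text.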

Facts (2)

Sources
A survey on augmenting knowledge graphs (KGs) with large ... (link.springer.com, Springer) — 1 fact
Claim: Pre-training methods for KG-enhanced LLMs incorporate knowledge graphs during the LLM training phase to enhance knowledge expression.
Practices, opportunities and challenges in the fusion of knowledge ... (frontiersin.org, Frontiers) — 1 fact
Reference: The integration of Knowledge Graphs into Large Language Models can be categorized into three types based on the effect of the enhancement: pre-training, reasoning methods (including supervised fine-tuning and alignment fine-tuning), and model interpretability.