Relations (1)

related (strength 1.58), strongly supported by 2 facts

Knowledge graphs and BERT are related through hybrid architectures like KG-BERT, which encodes graph triples using BERT [1], and K-BERT, which enhances BERT's performance by injecting domain-specific knowledge from knowledge graphs [2].
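KG-BERT's core move, serializing a (head, relation, tail) triple into a single BERT-style text sequence that a classifier can then score, can be sketched in a few lines of Python. This is a hedged illustration of the idea described above, not the released KG-BERT implementation; the token strings and segment layout are assumptions.

```python
# Sketch of KG-BERT-style triple serialization (assumption: this mirrors
# the paper's description, not the released code). A knowledge-graph
# triple becomes one [CLS]/[SEP]-delimited sequence that a BERT-style
# sequence classifier can score as plausible or implausible.

def serialize_triple(head: str, relation: str, tail: str) -> str:
    """Turn a KG triple into a [CLS]/[SEP]-delimited text sequence."""
    return f"[CLS] {head} [SEP] {relation} [SEP] {tail} [SEP]"

def segment_ids(head: str, relation: str, tail: str) -> list[int]:
    """One segment id per token, distinguishing the triple's parts
    (0 = head span, 1 = relation span, 2 = tail span)."""
    ids = [0]  # the leading [CLS] token goes with the head segment
    for seg, text in enumerate([head, relation, tail]):
        # each span contributes its words plus one trailing [SEP]
        ids.extend([seg] * (len(text.split()) + 1))
    return ids

seq = serialize_triple("Steve Jobs", "founded", "Apple Inc")
# seq == "[CLS] Steve Jobs [SEP] founded [SEP] Apple Inc [SEP]"
```

In the actual model, this sequence is tokenized and fed to BERT, and the [CLS] representation is used to classify whether the triple holds; the sketch stops at the text layer to stay self-contained.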

Facts (2)

Sources
Practices, opportunities and challenges in the fusion of knowledge ... (Frontiers, frontiersin.org), 1 fact
Reference: The KG-BERT model (Yao et al., 2019) treats knowledge graph triples as textual sequences and encodes them using BERT-style architectures.
Combining Knowledge Graphs and Large Language Models (arXiv, arxiv.org), 1 fact
Reference: K-BERT is a joint model that addresses the lack of domain-specific knowledge in BERT by injecting domain knowledge from Knowledge Graphs into sentences.
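K-BERT's injection step, splicing knowledge-graph triples for matched entities directly into the input sentence, can be sketched as follows. This is a simplified, assumption-laden illustration of the idea, not the published K-BERT implementation; the toy graph and the in-line `(relation object)` format are invented for the example.

```python
# Simplified sketch of K-BERT-style knowledge injection (illustrative
# only, not the published implementation). Entities found in a sentence
# get their KG triples spliced in right after the mention, producing a
# knowledge-enriched sequence for the encoder.

# Toy domain knowledge graph: entity -> list of (relation, object) pairs
KG = {
    "aspirin": [("treats", "headache")],
    "Beijing": [("capital_of", "China")],
}

def inject_knowledge(sentence: str, kg: dict) -> str:
    """Append each matched entity's triples in-line after its mention."""
    out = []
    for token in sentence.split():
        out.append(token)
        for relation, obj in kg.get(token, []):
            out.append(f"({relation} {obj})")
    return " ".join(out)

enriched = inject_knowledge("Beijing hosts many hospitals", KG)
# enriched == "Beijing (capital_of China) hosts many hospitals"
```

The full model additionally uses soft-position embeddings and a visibility matrix so the injected tokens do not disturb the original sentence's structure or attend where they should not; this sketch omits both and keeps only the injection idea.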