Relations (1)

related (0.20) — supported by 2 facts

Large Language Models are related to knowledge graph reasoning in two ways: their probabilistic nature creates explainability barriers in such tasks [1], and specific methods such as 'ChatRule' leverage LLMs to mine logical rules for knowledge graph reasoning [2].
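ChatRule's actual pipeline is not described in these facts; as a minimal illustration of what "logical rules for knowledge graph reasoning" means, the sketch below applies a hypothetical mined Horn rule to a toy triple set to infer new facts. The rule, relation names, and triples are invented examples, not drawn from the paper.

```python
# Minimal sketch (not ChatRule's actual implementation): applying a mined
# Horn rule over a toy knowledge graph of (head, relation, tail) triples.
# All names below are hypothetical illustrations.

triples = {
    ("alice", "spouse", "bob"),
    ("bob", "lives_in", "paris"),
}

# A mined rule of the form: spouse(x, y) AND lives_in(y, z) -> lives_in(x, z),
# encoded as (body relation chain, head relation).
rule = (("spouse", "lives_in"), "lives_in")

def apply_rule(triples, rule):
    """Return new triples entailed by chaining the rule's body relations."""
    (r1, r2), head_rel = rule
    inferred = set()
    for (x, rel_a, y) in triples:
        if rel_a != r1:
            continue
        for (y2, rel_b, z) in triples:
            if rel_b == r2 and y2 == y:
                inferred.add((x, head_rel, z))
    return inferred - triples  # only facts not already in the graph

print(apply_rule(triples, rule))  # {('alice', 'lives_in', 'paris')}
```

Because the rule itself is symbolic, each inferred triple comes with an explicit derivation chain, which is the explainability advantage rule mining aims to recover from otherwise opaque LLM reasoning.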

Facts (2)

Sources
Practices, opportunities and challenges in the fusion of knowledge ... (Frontiers, frontiersin.org) — 2 facts
Reference: Luo et al. (2023a) proposed 'ChatRule', a method for mining logical rules with large language models for knowledge graph reasoning, in the preprint 'Chatrule: mining logical rules with large language models for knowledge graph reasoning'.
Claim: The probabilistic nature of Large Language Models (LLMs) creates fundamental explainability barriers in knowledge graph reasoning tasks.