Relations (1)

related (score 2.58, strongly supporting; 5 facts)

Large Language Models and graph neural networks are related through their integration in neuro-symbolic AI [1] and their combined use to improve the modeling of graph-structured data [2]. This synergy is further evidenced by specific methods such as 'Graph neural prompting' [3], the GraphLLM framework [4], and various architectural integration strategies [5].

Facts (5)

Sources
Large Language Models Meet Knowledge Graphs for Question ... (arXiv)
Reference: GraphLLM leverages large language models to decompose multi-hop questions into sub-questions and retrieves sub-graphs via graph neural networks and large language models to generate answers based on graph reasoning.
Grounding LLM Reasoning with Knowledge Graphs (arXiv)
Procedure: There are four primary methods for integrating Knowledge Graphs with Large Language Models: (1) learning graph representations, (2) using Graph Neural Network (GNN) retrievers to extract entities as text input, (3) generating code like SPARQL queries to retrieve information, and (4) using step-by-step interaction methods for iterative reasoning.
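Method (4), step-by-step interaction, can be sketched as a loop that alternates between retrieving edges from the graph and asking a reasoner what to do next. The sketch below is purely illustrative and makes several assumptions: the toy graph, the stopping criterion, and the names `toy_kg`, `stub_reasoner`, and `iterative_kg_qa` are all hypothetical, with a stub function standing in for a real LLM call; none of this comes from the cited papers.

```python
# Illustrative sketch of iterative KG interaction (method 4). The KG is a
# toy in-memory dict and stub_reasoner is a stand-in for an LLM call.

# Toy KG: subject -> list of (relation, object) edges. Hypothetical data.
toy_kg = {
    "Paris": [("capital_of", "France")],
    "France": [("located_in", "Europe")],
}

def retrieve(entity):
    """Retrieval step: return the outgoing edges of an entity."""
    return toy_kg.get(entity, [])

def stub_reasoner(question, evidence):
    """Stand-in for an LLM: decide whether the gathered triples answer
    the question, or which entity to expand next (toy criterion)."""
    for subj, rel, obj in evidence:
        if rel == "located_in":           # toy stopping criterion
            return ("answer", obj)
    if evidence:                          # expand the last object seen
        return ("expand", evidence[-1][2])
    return ("expand", "Paris")            # seed entity for this toy question

def iterative_kg_qa(question, seed, max_steps=5):
    """Iterative retrieve-then-reason loop over the knowledge graph."""
    evidence, frontier = [], seed
    for _ in range(max_steps):
        evidence += [(frontier, r, o) for r, o in retrieve(frontier)]
        action, value = stub_reasoner(question, evidence)
        if action == "answer":
            return value
        frontier = value
    return None

print(iterative_kg_qa("Which continent is Paris in?", "Paris"))  # Europe
```

The loop structure (retrieve, reason, expand or stop) is the essence of the iterative method; a real system would replace the stub with model calls and the dict with a graph store.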
The Synergy of Symbolic and Connectionist AI in LLM ... (arXiv)
Claim: The integration of graph neural networks with rule-based reasoning positioned knowledge graphs at the core of the neuro-symbolic AI approach prior to the surge of Large Language Models (LLMs).
Practices, opportunities and challenges in the fusion of knowledge ... (Frontiers)
Reference: Tian et al. (2024) proposed 'Graph neural prompting', a method for using large language models with graph neural networks.
KG-IRAG: A Knowledge Graph-Based Iterative Retrieval-Augmented ... (arXiv)
Claim: Combining Large Language Models with Graph Neural Networks (GNNs) significantly improves the modeling of graph-structured data.