Relations (1)
related (score 2.58) — strongly supported by 5 facts
Knowledge graphs and chain-of-thought are integrated as complementary techniques in LLM reasoning frameworks, as evidenced by the development of methods like KG-CoT {fact:2, fact:5} and AprèsCoT {fact:3, fact:4}, which combine structured graph retrieval with sequential reasoning steps [1].
Facts (5)
Sources
Large Language Models Meet Knowledge Graphs for Question ... (arxiv.org) — 2 facts
Claim: Ruilin Zhao, Feng Zhao, Long Wang, Xianzhi Wang, and Guandong Xu published the paper 'KG-CoT: Chain-of-thought prompting of large language models over knowledge graphs for knowledge-aware question answering' in 2024.
Reference: Shirdel et al. (2025) published 'AprèsCoT: Explaining LLM answers with knowledge graphs and chain of thought' in EDBT, pages 1142–1145, introducing a method for explaining LLM outputs using knowledge graphs and chain-of-thought reasoning.
Practices, opportunities and challenges in the fusion of knowledge ... (frontiersin.org) — 1 fact
Reference: The paper 'Kg-cot: chain-of-thought prompting of large language models over knowledge graphs for knowledge-aware question answering' was published in the Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI-24) in 2024.
LLM-KG4QA: Large Language Models and Knowledge Graphs for ... (github.com) — 1 fact
Reference: AprèsCoT is a system that explains Large Language Model answers by utilizing knowledge graphs and Chain of Thought reasoning.
Grounding LLM Reasoning with Knowledge Graphs - arXiv (arxiv.org) — 1 fact
Procedure: The framework for grounding LLM reasoning in knowledge graphs integrates each reasoning step with structured graph retrieval, combining strategies such as Chain of Thought (CoT), Tree of Thoughts (ToT), and Graph of Thoughts (GoT) with adaptive graph search.
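The interleaving of reasoning steps with graph retrieval described in the facts above can be illustrated with a minimal sketch. This is not the implementation from any of the cited papers: the toy knowledge graph, the `kg_retrieve` helper, and the step logic are all hypothetical, showing only the core idea that each intermediate conclusion in the chain is grounded by a lookup over structured triples.

```python
# Hypothetical sketch of KG-grounded chain-of-thought reasoning.
# Each reasoning step is paired with a retrieval over a toy knowledge
# graph, so intermediate conclusions stay anchored to graph facts.

TOY_KG = {
    ("Paris", "capital_of"): "France",
    ("France", "continent"): "Europe",
}

def kg_retrieve(entity, relation):
    """Look up a single (entity, relation) -> object triple in the toy KG."""
    return TOY_KG.get((entity, relation))

def grounded_cot(start_entity, relations):
    """Follow a chain of relations, grounding each step in the KG.

    The object retrieved at each step becomes the subject of the next,
    mimicking how KG-CoT-style methods interleave reasoning and retrieval.
    Returns the reasoning trace and the final grounded answer.
    """
    entity = start_entity
    trace = []
    for rel in relations:
        obj = kg_retrieve(entity, rel)
        if obj is None:
            trace.append(f"{entity} -{rel}-> ? (no grounding found)")
            break
        trace.append(f"{entity} -{rel}-> {obj}")
        entity = obj
    return trace, entity

# "Which continent is the country whose capital is Paris on?"
trace, answer = grounded_cot("Paris", ["capital_of", "continent"])
```

A real system would replace the dictionary lookup with adaptive graph search over a large KG and let an LLM propose the next relation at each step; the fixed relation list here stands in for that proposal step.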