Relations (1)

Relation score: 2.32 (strongly supporting, 4 facts)

Question answering is a primary task for which language models are optimized, as evidenced by research integrating knowledge graph reasoning [1], [2], [3]. Furthermore, frameworks such as SHARE are designed specifically to improve the performance of a range of language models on question answering tasks [4].

Facts (4)

Sources
Source: Knowledge Graph Combined with Retrieval-Augmented Generation ... (Academic Journal of Science and Technology, drpress.org), 1 fact
Reference: Yasunaga et al. introduced QA-GNN, a method for reasoning with language models and knowledge graphs for question answering, in a 2021 arXiv preprint.

Source: Track: Poster Session 3, AISTATS 2026 (virtual.aistats.org), Samuel Tesfazgi, Leonhard Sprandl, Sandra Hirche, AISTATS, 1 fact
Claim: The Shapley-value Guided Rationale Editor (SHARE) is adaptable to tasks including sentiment analysis, claim verification, and question answering, and can integrate with various language models.

Source: Neuro-Symbolic AI: Explainability, Challenges, and Future Trends (arXiv, arxiv.org), 1 fact
Reference: Hu et al. (2022b) proposed a method for empowering language models by integrating knowledge graph reasoning for question answering tasks.

Source: Large Language Models Meet Knowledge Graphs for Question ... (arXiv, arxiv.org), 1 fact
Procedure: Xiangrong Zhu, Yuexiang Xie, Yi Liu, Yaliang Li, and Wei Hu (2025) conducted a literature review, retrieving research papers published since 2021 using Google Scholar and PaSa with search phrases such as 'knowledge graph and language model for question answering' and 'KG and LLM for QA', and extending the search scope to 2016 for benchmark dataset papers.