reference
Knowledge-Aware Inference in LLMs retrieves structured (head, relation, tail) triples from a Knowledge Graph, reasons over multi-hop graph paths, and constrains generation to the retrieved symbolic relationships. This improves multi-hop reasoning and factual QA performance without retraining the underlying model.
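The pipeline described above can be sketched in miniature: retrieve triples, search for multi-hop paths through the graph, and linearize the resulting paths into facts that constrain the model's prompt. Everything here is a toy assumption — the triples, the BFS path search, and the prompt template are illustrative, not any specific system's API.

```python
from collections import deque

# Toy knowledge graph as (head, relation, tail) triples -- illustrative data only.
TRIPLES = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Warsaw", "capital_of", "Poland"),
    ("Marie Curie", "field", "Physics"),
]

def neighbors(entity):
    """Yield (relation, tail) edges leaving an entity."""
    for h, r, t in TRIPLES:
        if h == entity:
            yield r, t

def find_paths(start, goal, max_hops=3):
    """Breadth-first search for multi-hop relation paths from start to goal."""
    queue = deque([(start, [])])
    paths = []
    while queue:
        node, path = queue.popleft()
        if node == goal and path:
            paths.append(path)
            continue
        if len(path) >= max_hops:
            continue
        for rel, nxt in neighbors(node):
            queue.append((nxt, path + [(node, rel, nxt)]))
    return paths

def to_prompt_facts(paths):
    """Linearize graph paths into textual facts for the LLM prompt."""
    facts = []
    for path in paths:
        for h, r, t in path:
            fact = f"{h} --{r}--> {t}"
            if fact not in facts:
                facts.append(fact)
    return facts

# Constrain generation: the model is asked to answer only from retrieved paths.
paths = find_paths("Marie Curie", "Poland")
facts = to_prompt_facts(paths)
prompt = ("Answer using only these facts:\n" + "\n".join(facts) +
          "\nQuestion: In which country was Marie Curie born?")
```

No model weights are touched: the symbolic constraint lives entirely in the retrieval step and the prompt, which is why this style of inference needs no retraining.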
