Relations (1)
related (score: 2.32) — strongly supported by 4 facts
Knowledge graphs and pre-trained language models are integrated in several research frameworks, including GenKGC [1], BERTRL [2], ReLMKG [3], and BertNet [4], which use language models for tasks such as knowledge graph completion, reasoning, and knowledge extraction.
Facts (4)
Sources
Practices, opportunities and challenges in the fusion of knowledge ... (frontiersin.org) — 4 facts
Cao and Liu (2023) proposed ReLMKG, a method for reasoning with pre-trained language models and knowledge graphs for complex question answering, published in Applied Intelligence.
The GenKGC model (Xie et al., 2022) recasts the knowledge graph completion task as a sequence-to-sequence generation task over a pre-trained language model (see the first sketch below).
Hao et al. (2022) introduced BertNet, a system for harvesting knowledge graphs with arbitrary relations from pre-trained language models.
BERTRL (Zha et al., 2022) fine-tunes pre-trained language models using relation instances and their reasoning paths as training samples (see the second sketch below).
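To make the GenKGC fact concrete, here is a minimal sketch of the sequence-to-sequence formulation of knowledge graph completion: an incomplete triple (head, relation, ?) is linearized into text and the missing tail entity is generated by an encoder-decoder language model. The model choice (t5-small), prompt format, and decoding settings are illustrative assumptions, not GenKGC's actual implementation (the paper builds on BART with its own entity-aware decoding).

```python
# Sketch of sequence-to-sequence knowledge graph completion:
# linearize an incomplete triple (head, relation, ?) into a text
# prompt and generate candidate tail entities with an encoder-decoder
# language model. Model and prompt format are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "t5-small"  # placeholder model, not GenKGC's actual backbone

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def linearize_query(head: str, relation: str) -> str:
    """Turn an incomplete triple (head, relation, ?) into a text prompt."""
    return f"predict tail: {head} | {relation}"

def predict_tail(head: str, relation: str, num_candidates: int = 3) -> list[str]:
    """Generate candidate tail entities with beam search."""
    inputs = tokenizer(linearize_query(head, relation), return_tensors="pt")
    outputs = model.generate(
        **inputs,
        num_beams=num_candidates,
        num_return_sequences=num_candidates,
        max_new_tokens=16,
    )
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)

# Example query: complete (Berlin, capital of, ?). Note that an
# off-the-shelf model will not produce reliable completions without
# fine-tuning on linearized KG triples first.
print(predict_tail("Berlin", "capital of"))
```

The point of the formulation is that completion becomes ordinary conditional text generation, so the same fine-tuning and decoding machinery used for any seq2seq task applies directly.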
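Similarly, the BERTRL fact can be illustrated with a hedged sketch of training-sample construction: a target relation instance and a supporting reasoning path from the graph are linearized into a sentence pair and fed to a BERT classifier fine-tuned to judge whether the path supports the triple. The linearization format, separator, and classification head below are assumptions for illustration, not the paper's exact setup.

```python
# Sketch of BERTRL-style training-sample construction: encode a
# (target triple, reasoning path) pair as a BERT sentence pair with a
# binary label. Linearization details are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

def linearize_triple(head: str, relation: str, tail: str) -> str:
    """Render one triple as plain text."""
    return f"{head} {relation.replace('_', ' ')} {tail}"

def make_sample(target, path, label):
    """Encode the target triple and its reasoning path as a sentence pair."""
    text_a = linearize_triple(*target)
    text_b = " ; ".join(linearize_triple(*t) for t in path)
    enc = tokenizer(text_a, text_b, truncation=True, return_tensors="pt")
    enc["labels"] = torch.tensor([label])
    return enc

# Positive sample: the path (A, born_in, Paris), (Paris, located_in, France)
# supports the target triple (A, nationality, France).
sample = make_sample(
    ("A", "nationality", "France"),
    [("A", "born_in", "Paris"), ("Paris", "located_in", "France")],
    label=1,
)
loss = model(**sample).loss  # standard fine-tuning loss on one sample
print(float(loss))
```

Framing path-based relation prediction as sentence-pair classification lets the pre-trained language model score unseen entities, which is what makes the approach inductive rather than tied to a fixed entity vocabulary.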