Relations (1)
related (score 0.10) — supported by 1 fact
The concepts are related because CoLAKE uses a unified pre-training framework to jointly learn representations of both language and knowledge by integrating them into a shared word-knowledge graph, as described in [1].
Facts (1)
Sources
Practices, opportunities and challenges in the fusion of knowledge ... (frontiersin.org)
CoLAKE (Sun et al., 2020) uses a unified pre-training framework that jointly learns contextualized representations of language and knowledge by integrating them into a shared structure called the word-knowledge graph.
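The fact above can be made concrete with a toy sketch of a word-knowledge graph: word nodes from a sentence are chained in order, entity mentions are anchored to knowledge-graph entity nodes, and KG triples attach further entities via relation-labeled edges. This is a minimal illustration of the general idea, not CoLAKE's actual data structures; all function names, entity IDs, and relations below are invented for the example.

```python
# Toy sketch of a word-knowledge graph (illustrative only; not CoLAKE's
# implementation). Word nodes carry sentence order, entity nodes come
# from a knowledge graph, and anchor edges connect the two.

def build_word_knowledge_graph(tokens, mention_to_entity, triples):
    """Return (nodes, edges) for a unified word-knowledge graph.

    tokens: list of word tokens in the sentence
    mention_to_entity: maps a surface mention (token) to a KG entity id
    triples: list of (head_entity, relation, tail_entity) KG facts
    """
    nodes = []
    edges = []  # (source_node, target_node, edge_label)

    # 1. Word nodes, chained by adjacency (the "language" side).
    for i, tok in enumerate(tokens):
        nodes.append(("word", tok))
        if i > 0:
            edges.append((("word", tokens[i - 1]), ("word", tok), "next"))

    # 2. Anchor edges: link each mention to its KG entity node.
    for mention, entity in mention_to_entity.items():
        nodes.append(("entity", entity))
        edges.append((("word", mention), ("entity", entity), "anchor"))

    # 3. Knowledge subgraph: attach triples whose head is anchored.
    anchored = set(mention_to_entity.values())
    for head, rel, tail in triples:
        if head in anchored:
            nodes.append(("entity", tail))
            edges.append((("entity", head), ("entity", tail), rel))

    return nodes, edges


# Hypothetical example: one anchored mention and one KG triple.
nodes, edges = build_word_knowledge_graph(
    tokens=["Harry", "Potter", "is", "a", "wizard"],
    mention_to_entity={"Harry": "Q3244512"},
    triples=[("Q3244512", "author", "Q34660")],
)
```

In this sketch the sentence and the knowledge subgraph end up as one graph, which is the property that lets a single pre-training objective see both language context and KG structure at once.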