reference
ERNIE (Enhanced Language RepresentatioN with Informative Entities) fuses lexical, syntactic, and knowledge information by stacking a knowledge encoder (K-Encoder) on top of a textual encoder (T-Encoder, which functions like BERT), so that word tokens and knowledge-graph entities are represented in a unified feature space.
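A minimal PyTorch sketch of this stacked design, not the authors' implementation: the class names (`ErnieSketch`, `FusionLayer`), the layer counts, the dimensions (768 for tokens, 100 for entities), and the pre-aligned per-token entity tensor are illustrative assumptions, and the fusion step is simplified relative to the paper's aggregator, which also runs multi-head self-attention over tokens and entities.

```python
import torch
import torch.nn as nn


class FusionLayer(nn.Module):
    """Simplified K-Encoder fusion step: project token and entity states
    into a shared space, mix them, and split them back out.
    (The paper's aggregator also applies multi-head self-attention to
    tokens and entities before this fusion; omitted here for brevity.)"""

    def __init__(self, d_token: int = 768, d_entity: int = 100):
        super().__init__()
        self.token_in = nn.Linear(d_token, d_token)
        self.entity_in = nn.Linear(d_entity, d_token)
        self.token_out = nn.Linear(d_token, d_token)
        self.entity_out = nn.Linear(d_token, d_entity)
        self.act = nn.GELU()

    def forward(self, tokens: torch.Tensor, entities: torch.Tensor):
        # tokens:   (batch, seq, d_token)
        # entities: (batch, seq, d_entity), assumed pre-aligned per token
        #           position, zero where a token has no linked entity
        #           (a stand-in for the paper's token-entity alignment).
        fused = self.act(self.token_in(tokens) + self.entity_in(entities))
        return self.token_out(fused), self.entity_out(fused)


class ErnieSketch(nn.Module):
    """T-Encoder (BERT-like Transformer over tokens) stacked under a
    K-Encoder that repeatedly fuses token and entity representations."""

    def __init__(self, vocab_size: int = 30522, d_token: int = 768,
                 d_entity: int = 100, n_t_layers: int = 6,
                 n_k_layers: int = 6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_token)
        t_layer = nn.TransformerEncoderLayer(
            d_model=d_token, nhead=12, batch_first=True)
        self.t_encoder = nn.TransformerEncoder(t_layer, num_layers=n_t_layers)
        self.k_encoder = nn.ModuleList(
            FusionLayer(d_token, d_entity) for _ in range(n_k_layers))

    def forward(self, token_ids: torch.Tensor, entity_states: torch.Tensor):
        tokens = self.t_encoder(self.embed(token_ids))  # lexical/syntactic
        for layer in self.k_encoder:                    # knowledge fusion
            tokens, entity_states = layer(tokens, entity_states)
        return tokens, entity_states


# Example: 2 sequences of 16 tokens, each with (here, all-zero) entity vectors.
model = ErnieSketch()
tok_out, ent_out = model(torch.randint(0, 30522, (2, 16)),
                         torch.zeros(2, 16, 100))
```

The key design point the sketch preserves is the two-stage stack: the T-Encoder sees only tokens, while every K-Encoder layer updates token and entity states jointly so knowledge can flow into the token representations.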
Sources
- Combining Knowledge Graphs and Large Language Models, arXiv (arxiv.org)
Referenced by nodes (1)
- BERT concept