reference
LUKE (Language Understanding with Knowledge-based Embeddings) is an extension of BERT that adds an entity-aware self-attention mechanism: words and entities in the input are treated as independent tokens, and the model outputs contextualized representations of both.
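A minimal sketch of the entity-aware self-attention idea, assuming random illustrative weights: the query projection is chosen per (query-token type, key-token type) pair, so word-to-entity attention is computed differently from word-to-word attention. Dimensions, names, and the single-head NumPy setup are illustrative, not LUKE's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # illustrative hidden size

# One query projection per (query type, key type) pair:
# word->word, word->entity, entity->word, entity->entity
Wq = {(a, b): rng.standard_normal((d, d)) / np.sqrt(d)
      for a in ("word", "entity") for b in ("word", "entity")}
Wk = rng.standard_normal((d, d)) / np.sqrt(d)
Wv = rng.standard_normal((d, d)) / np.sqrt(d)

def entity_aware_attention(x, types):
    """x: (n, d) token embeddings; types: list of 'word'/'entity' labels."""
    n = x.shape[0]
    k, v = x @ Wk, x @ Wv
    out = np.empty_like(x)
    for i in range(n):
        # The query projection varies with the (query, key) token-type pair.
        scores = np.array([(x[i] @ Wq[(types[i], types[j])]) @ k[j]
                           for j in range(n)]) / np.sqrt(d)
        w = np.exp(scores - scores.max())
        w /= w.sum()          # softmax over keys
        out[i] = w @ v        # contextualized representation of token i
    return out

x = rng.standard_normal((5, d))
types = ["word", "word", "word", "entity", "entity"]
print(entity_aware_attention(x, types).shape)
```

Standard self-attention uses a single query matrix; keeping four type-dependent query matrices lets the model score word-entity interactions separately from word-word ones.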
Authors
Sources
- Combining Knowledge Graphs and Large Language Models (arXiv)
Referenced by nodes (1)
- BERT concept