Reference
The paper 'Self-Attention Networks Can Process Bounded Hierarchical Languages' is an arXiv preprint (arXiv:2105.11115) showing that self-attention networks (transformers) can process bounded-depth hierarchical languages, formalized as bounded-depth Dyck languages.
Authors
- Shunyu Yao
- Binghui Peng
- Christos Papadimitriou
- Karthik Narasimhan
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org, via Serper)
Referenced by nodes (1)
- arXiv entity