Reference
The paper 'Attention is Turing Complete' shows that the Transformer architecture, whose core component is the attention mechanism, is Turing-complete; the proof relies on hard attention and arbitrary-precision arithmetic.
Authors
- Jorge Pérez, Pablo Barceló, Javier Marinkovic (JMLR, 2021)
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org)
Referenced by nodes (1)
- attention mechanism concept
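As context for the claim above, the sketch below shows what the attention operation computes: scaled dot-product attention, softmax(QK^T / sqrt(d)) V. This is an illustrative, pure-Python sketch of standard soft attention, not the paper's construction (which additionally assumes hard attention and unbounded precision); all function names here are hypothetical.

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of d-dimensional vectors.

    For each query q, compute similarity scores q.k / sqrt(d) against all
    keys, normalize them with softmax, and return the weighted sum of values.
    """
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # Weighted combination of value vectors.
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# A query that strongly matches the first key attends almost entirely to
# the first value vector.
result = attention([[10.0, 0.0]], [[10.0, 0.0], [0.0, 10.0]], [[1.0, 0.0], [0.0, 1.0]])
```

With the query aligned to the first key, the output row is close to the first value `[1, 0]`, since nearly all attention weight falls on that key.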