reference
The paper 'RNNs are not Transformers (Yet): The Key Bottleneck on In-Context Retrieval' is an arXiv preprint (arXiv:2402.18510).
Authors
Sources
- A Survey on the Theory and Mechanism of Large Language Models arxiv.org via serper
Referenced by nodes (2)
- arXiv entity
- Transformers concept