Reference
The paper 'Repeat After Me: Transformers are Better than State Space Models at Copying' is an arXiv preprint, identified as arXiv:2402.01032.
Authors
- Samy Jelassi
- David Brandfonbrener
- Sham M. Kakade
- Eran Malach
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org)
Referenced by nodes (1)
- Transformers concept