Claim
The research paper 'Retentive Network: A Successor to Transformer for Large Language Models' (arXiv:2307.08621) proposes the Retentive Network as an alternative architecture to the Transformer for large language models.
Authors
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org, via serper)
Referenced by nodes (2)
- Large Language Models concept
- Transformer concept