Reference
The paper "Transformer Working Memory Enables Regular Language Reasoning and Natural Language Length Extrapolation" by Ta-Chung Chi, Ting-Han Fan, Alexander Rudnicky, and Peter Ramadge was published in Findings of the Association for Computational Linguistics: EMNLP 2023.
Sources
- A Survey of Incorporating Psychological Theories in LLMs (arXiv)