Claim
In psychology, attention refers to selective mental focus; in transformer-based LLMs, attention is a learned token-weighting mechanism with no cognitive awareness.
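The mechanical sense of "attention" can be made concrete with a minimal scaled dot-product sketch (illustrative NumPy code, not drawn from the cited survey): each output is just a weighted average of value vectors, with weights computed from similarity scores.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention. The "attention weights" are softmaxed
    # similarity scores between query and key vectors; there is no notion
    # of awareness or focus, only a weighted sum over value vectors.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # token-to-token similarity
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))  # 3 query tokens, embedding dim 4
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = attention(Q, K, V)
```

The weights form a probability distribution over tokens for each query, which is all "attending" means here.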
Authors
Sources
- A Survey of Incorporating Psychological Theories in LLMs - arXiv (arxiv.org)
Referenced by nodes (2)
- attention concept
- psychology concept