Claim
In psychology, attention refers to the selective allocation of mental focus to some stimuli over others; in transformer-based LLMs, attention is a purely mathematical mechanism that assigns weights to token representations and involves no cognitive awareness.
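
As an illustration of the mechanistic sense of the term, the sketch below implements scaled dot-product attention in NumPy; the function name, array shapes, and self-attention usage are hypothetical, chosen only to show that attention in a transformer reduces to a softmax-weighted average over token vectors, with no analogue of conscious focus.

import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    # Similarity score of each query token against every key token.
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])
    # Softmax converts the scores into weights that sum to 1 per query token.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors: pure
    # arithmetic, not selective focus in the psychological sense.
    return weights @ values

# Hypothetical example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention
print(out.shape)  # (4, 8)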
