claim
Relative positional encodings introduce a distance-attenuation effect that competes with, and balances, the deviation induced by the causal mask in multi-layer Transformers, according to Wu et al. (2025c).
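The interplay the claim describes can be sketched numerically. The snippet below is a minimal illustration, not Wu et al.'s construction: it uses an ALiBi-style linear distance penalty as a stand-in for relative positional encodings, and applies a causal mask on top, so one can see how the distance bias skews attention toward recent keys within the positions the mask permits.

```python
import numpy as np

def attention_weights(scores, slope=0.5):
    """Softmax attention with a distance-dependent bias and a causal mask.

    scores: (T, T) raw query-key logits.
    slope:  strength of the linear distance penalty (illustrative
            stand-in for relative positional encodings).
    """
    T = scores.shape[0]
    i = np.arange(T)[:, None]  # query positions
    j = np.arange(T)[None, :]  # key positions
    bias = -slope * (i - j)    # attenuates with distance (i - j)
    # Causal mask: a query at position i may only attend to keys j <= i.
    masked = np.where(j <= i, scores + bias, -np.inf)
    e = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# With all-zero logits, the distance bias alone makes each row favour
# nearby keys, while the causal mask zeroes out future positions.
w = attention_weights(np.zeros((4, 4)), slope=0.5)
```

In each row of `w`, weight decays with distance to the query (the attenuation effect) and is exactly zero above the diagonal (the causal mask), which is the tension the claim says these two mechanisms balance across layers.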
Authors
Sources
- A Survey on the Theory and Mechanism of Large Language Models arxiv.org via serper
Referenced by nodes (1)
- Transformers concept