Reference
The paper 'In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness' is an arXiv preprint (arXiv:2402.11639) on in-context learning.
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org)
Referenced by nodes (3)
- Transformers concept
- In-Context Learning concept
- ArXiv concept