Reference
The paper 'Pre-training under infinite compute' is an arXiv preprint (arXiv:2509.14786) cited in Section 4.2.1 of 'A Survey on the Theory and Mechanism of Large Language Models'.
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org, via Serper)
Referenced by nodes (1)
- Pre-training concept