claim
Bini et al. (2024) utilize orthogonal transformations for fine-tuning as an alternative to LoRA.
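The idea can be sketched as follows: instead of LoRA's additive low-rank update (W + BA), the frozen weight is multiplied by a learned orthogonal matrix, which preserves the weight's norm and spectrum. This is a minimal illustrative sketch, assuming a Householder-reflection parameterization (one trainable direction vector `u`); the actual construction in Bini et al. (2024) may differ.

```python
import numpy as np

def householder(u):
    """Orthogonal reflection matrix H = I - 2 u u^T / ||u||^2."""
    u = u / np.linalg.norm(u)
    return np.eye(len(u)) - 2.0 * np.outer(u, u)

rng = np.random.default_rng(0)
d = 8
W = rng.standard_normal((d, d))  # frozen pretrained weight
u = rng.standard_normal(d)       # trainable direction: the only new parameters (d, vs. LoRA's 2*d*r)

H = householder(u)
W_adapted = H @ W                # multiplicative update, unlike LoRA's additive W + B @ A

# H is orthogonal, so the adaptation preserves the Frobenius norm of W
assert np.allclose(H @ H.T, np.eye(d))
assert np.allclose(np.linalg.norm(W_adapted, "fro"), np.linalg.norm(W, "fro"))
```

During fine-tuning only `u` would receive gradients, while `W` stays frozen, mirroring the parameter-efficiency goal shared with LoRA.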
Authors
Sources
- A Survey on the Theory and Mechanism of Large Language Models (arxiv.org)
Referenced by nodes (1)
- fine-tuning concept