Reference
Huang et al. introduced in-context learning distillation, a method that transfers few-shot learning capabilities from large pre-trained language models to smaller models.
Sources
- Unlocking the Potential of Generative AI through Neuro-Symbolic ... (arxiv.org)
Referenced by nodes (1)
- Large Language Models concept