reference
Knowledge distillation trains a smaller student model to reproduce the behavior of a larger teacher model, typically by matching the teacher's softened output distribution (and sometimes its internal representations), so that machine learning can be deployed in resource-constrained environments.
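A minimal sketch of the standard logit-matching formulation (Hinton et al., 2015), assuming PyTorch; the function name, temperature, and mixing weight are illustrative assumptions, not details taken from the referenced poster:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (student vs. teacher, both softened
    by a temperature) with ordinary cross-entropy on the true labels."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean")
    kd = kd * temperature ** 2  # rescale so gradients match the CE term

    # Hard targets: standard cross-entropy against the ground truth.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce

# Toy usage: random logits stand in for teacher/student forward passes.
if __name__ == "__main__":
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```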
Sources
- Track: Poster Session 3, AISTATS 2026 (virtual.aistats.org)
Referenced by nodes (1)
- machine learning concept