reference
Knowledge distillation trains a smaller student model to mimic a larger teacher model's outputs (typically its temperature-softened class probabilities) or, in some variants, its internal representations, producing a compact model suitable for deployment in resource-constrained environments.
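As a minimal sketch of the output-matching form of distillation (assuming the standard temperature-scaled KL objective from Hinton et al., 2015; the function and variable names here are illustrative, not from the source):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature yields softer probabilities."""
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 to keep gradient magnitudes comparable
    across temperatures."""
    p = softmax(teacher_logits, temperature)  # teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return float(temperature ** 2 * np.sum(p * (np.log(p) - np.log(q))))

# When the student exactly matches the teacher, the loss is zero;
# any divergence makes it strictly positive.
```

In practice this term is usually combined with an ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.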


Referenced by nodes (1)