Claim
In-context learning distillation trains a smaller student model by combining an in-context learning (distillation) objective with the standard language-modeling objective, allowing the student to retain few-shot performance with limited data at lower computational cost.
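The combined objective can be sketched as a weighted sum of two terms: a distillation term that matches the student's predictive distribution to the teacher's on few-shot (in-context) answers, and a standard language-modeling term on the raw text. The sketch below is a minimal, hypothetical illustration in plain Python; the function names, the `alpha` weighting, and the toy probability inputs are assumptions for illustration, not the paper's implementation.

```python
import math

def cross_entropy(probs, target_idx):
    # Language-modeling term: negative log-likelihood of the target token.
    return -math.log(probs[target_idx])

def kl_divergence(p_teacher, q_student):
    # Distillation term: KL(teacher || student) over the vocabulary.
    return sum(p * math.log(p / q)
               for p, q in zip(p_teacher, q_student) if p > 0)

def icl_distillation_loss(student_probs, teacher_probs,
                          lm_targets, lm_student_probs, alpha=0.5):
    """Hypothetical combined objective: alpha-weighted sum of an
    in-context learning distillation term (match the teacher's
    distributions on few-shot answer positions) and a standard
    language-modeling cross-entropy term."""
    icl_term = sum(kl_divergence(t, s)
                   for t, s in zip(teacher_probs, student_probs)) / len(teacher_probs)
    lm_term = sum(cross_entropy(p, t)
                  for p, t in zip(lm_student_probs, lm_targets)) / len(lm_targets)
    return alpha * icl_term + (1 - alpha) * lm_term
```

When the student already matches the teacher and predicts the language-modeling targets perfectly, both terms vanish and the loss is zero; `alpha` trades off imitation of the teacher against fitting the raw text.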

Authors

Sources

Referenced by nodes (2)