claim
Multi-task representation learning outperforms single-task representation learning for over-parameterized two-layer convolutional neural networks trained by gradient descent.
Authors
Sources
- Track: Poster Session 3, AISTATS 2026 (virtual.aistats.org) via serper
Referenced by nodes (3)
- gradient descent concept
- multi-task learning concept
- convolutional neural networks concept
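The setting named in the claim can be sketched concretely. Below is a minimal, illustrative NumPy toy: several tasks share one over-parameterized convolutional layer (layer 1) while each task keeps its own linear head (layer 2), and both layers are trained jointly by plain full-batch gradient descent. All sizes, the synthetic data, and the ReLU/average-pool choices are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): T tasks, n samples per task,
# input dim d, filter width k, m filters (over-parameterized).
T, n, d, k, m = 3, 40, 16, 5, 32

X = [rng.standard_normal((n, d)) for _ in range(T)]   # synthetic inputs
y = [rng.standard_normal(n) for _ in range(T)]        # synthetic regression targets

W = rng.standard_normal((m, k)) * 0.1                 # shared conv filters (layer 1)
A = rng.standard_normal((T, m)) * 0.1                 # task-specific heads (layer 2)

def patches(x):
    # All sliding windows of length k: shape (d - k + 1, k).
    return np.lib.stride_tricks.sliding_window_view(x, k)

def forward(x, a):
    P = patches(x)                  # (p, k)
    Z = P @ W.T                     # pre-activations, (p, m)
    H = np.maximum(Z, 0.0)          # ReLU
    h = H.mean(axis=0)              # average pooling -> shared features (m,)
    return a @ h, P, Z, h

def loss_and_grads():
    # Average squared-error loss over all tasks and samples,
    # with gradients for both the shared layer and the heads.
    gW = np.zeros_like(W); gA = np.zeros_like(A); loss = 0.0
    scale = 1.0 / (T * n)
    for t in range(T):
        for i in range(n):
            pred, P, Z, h = forward(X[t][i], A[t])
            r = pred - y[t][i]
            loss += 0.5 * r ** 2 * scale
            gA[t] += r * h * scale
            # Backprop through mean-pool and ReLU into the shared filters.
            G = r * (Z > 0) * A[t] * scale / Z.shape[0]   # (p, m)
            gW += G.T @ P
    return loss, gW, gA

lr = 0.1
l0, _, _ = loss_and_grads()
for _ in range(150):                # plain gradient descent on both layers
    _, gW, gA = loss_and_grads()
    W -= lr * gW
    A -= lr * gA
l1, _, _ = loss_and_grads()
print(l0, l1)                       # training loss decreases
```

The single-task baseline in the claim corresponds to training one such network per task with `T = 1`, so no parameters are shared across tasks; the contrast studied in the paper is between that baseline and the shared-layer setup above.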