claim
Mixture-of-experts (MoE) models are machine learning methods that capture heterogeneous behavior across the data space by combining an ensemble of specialized learners (experts) through a gating function, making them suitable for dynamic data that exhibit non-stationarity and heavy-tailed errors.
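A minimal sketch of the mixture-of-experts idea, assuming linear experts and a softmax gating network; all names, dimensions, and weights below are illustrative assumptions, not taken from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the source).
n_experts, d_in, d_out = 3, 5, 1

# Each expert is a simple linear model; the gate scores experts per input.
expert_W = rng.normal(size=(n_experts, d_in, d_out))
gate_W = rng.normal(size=(d_in, n_experts))

def moe_predict(x):
    """Combine expert outputs using input-dependent softmax gating weights."""
    logits = x @ gate_W                              # (n_experts,) gating scores
    gates = np.exp(logits - logits.max())
    gates /= gates.sum()                             # softmax mixture weights
    outputs = np.stack([x @ W for W in expert_W])    # (n_experts, d_out) expert predictions
    return (gates[:, None] * outputs).sum(axis=0)    # gate-weighted combination

x = rng.normal(size=d_in)
print(moe_predict(x))
```

Because the gating weights depend on the input, different experts dominate in different regions of the data space, which is how the model accommodates heterogeneous behavior.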
Authors
Sources
- Track: Poster Session 3 - AISTATS 2026 virtual.aistats.org via serper
Referenced by nodes (1)
- machine learning concept