Infinite Mixtures of Gaussian Process Experts
Rasmussen, Carl E., Ghahramani, Zoubin
We present an extension to the Mixture of Experts (ME) model in which the individual experts are Gaussian Process (GP) regression models. Using an input-dependent adaptation of the Dirichlet Process, we implement a gating network for an infinite number of experts. Inference in this model may be done efficiently using a Markov chain that relies on Gibbs sampling. The model allows the effective covariance function to vary with the inputs, and may handle large datasets, thus potentially overcoming two of the biggest hurdles with GP models.
Neural Information Processing Systems
Dec-31-2002
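
The abstract names three ingredients: GP regression experts, a Dirichlet-process gating network, and Gibbs sampling over expert assignments. The following minimal Python sketch illustrates only the flavour of that scheme, not the authors' algorithm: it substitutes a plain Chinese-restaurant-process prior for the paper's input-dependent gating, keeps GP hyperparameters fixed, and uses illustrative names (gp_log_marginal, gibbs_sweep) and toy parameter values that are assumptions rather than anything from the paper.

    # Minimal illustrative sketch: Gibbs sampling of expert assignments for a
    # mixture of GP experts, with a plain CRP prior in place of the paper's
    # input-dependent gating and with fixed GP hyperparameters.
    import numpy as np

    rng = np.random.default_rng(0)

    def gp_log_marginal(x, y, lengthscale=1.0, signal=1.0, noise=0.1):
        """Log marginal likelihood of a GP with a squared-exponential kernel."""
        x = np.asarray(x, dtype=float).reshape(-1, 1)
        y = np.asarray(y, dtype=float)
        n = len(y)
        K = signal**2 * np.exp(-0.5 * (x - x.T) ** 2 / lengthscale**2) + noise**2 * np.eye(n)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)

    def gibbs_sweep(x, y, z, alpha_crp=1.0):
        """One Gibbs sweep over the expert assignments z (a list of integer labels)."""
        n = len(y)
        for i in range(n):
            z[i] = -1  # remove point i from its current expert
            labels = sorted(set(z) - {-1})
            candidates = labels + [max(labels, default=-1) + 1]  # existing experts + a new one
            log_probs = []
            for k in candidates:
                members = [j for j in range(n) if z[j] == k]
                prior = np.log(len(members)) if members else np.log(alpha_crp)  # CRP prior
                # conditional likelihood of y_i under expert k: joint minus leave-one-out
                ll = gp_log_marginal(x[members + [i]], y[members + [i]])
                if members:
                    ll -= gp_log_marginal(x[members], y[members])
                log_probs.append(prior + ll)
            p = np.exp(np.array(log_probs) - max(log_probs))
            z[i] = candidates[rng.choice(len(candidates), p=p / p.sum())]
        return z

    # Toy usage: data with two qualitatively different regimes.
    x = np.linspace(0.0, 4.0, 40)
    y = np.where(x < 2.0, np.sin(3.0 * x), 0.2 * rng.standard_normal(40))
    z = [0] * len(x)
    for _ in range(20):
        z = gibbs_sweep(x, y, z)
    print("experts discovered:", len(set(z)))

Because each expert only models the points assigned to it, the expensive cubic-cost GP computations are confined to small subsets of the data, which is the intuition behind the abstract's claim that the model may handle large datasets.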