Heinonen, Markus
A Mutually-Dependent Hadamard Kernel for Modelling Latent Variable Couplings
Remes, Sami, Heinonen, Markus, Kaski, Samuel
We introduce a novel kernel that models input-dependent couplings across multiple latent processes. The pairwise joint kernel measures covariance along inputs and across different latent signals in a mutually-dependent fashion. A latent correlation Gaussian process (LCGP) model combines these non-stationary latent components into multiple outputs by an input-dependent mixing matrix. Probit classification and support for multiple observation sets are derived by Variational Bayesian inference. Results on several datasets indicate that the LCGP model can recover the correlations between latent signals while simultaneously achieving state-of-the-art performance. We highlight the latent covariances with an EEG classification dataset where latent brain processes and their couplings simultaneously emerge from the model.
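As a rough illustration of the pairwise joint kernel idea, here is a minimal sketch assuming a decomposition into an input kernel multiplied by an input-dependent signal-coupling term. The names joint_gram, k_input and coupling are hypothetical placeholders; in the paper the couplings are themselves modelled with Gaussian processes over the inputs rather than fixed callables.

```python
import numpy as np

def joint_gram(xs, n_signals, k_input, coupling):
    """Gram matrix over all (input, signal) pairs: the entry for
    ((x, i), (x', j)) multiplies covariance along inputs by an
    input-dependent correlation across latent signals i and j."""
    n = len(xs)
    K = np.zeros((n * n_signals, n * n_signals))
    for a, x in enumerate(xs):
        for b, xp in enumerate(xs):
            for i in range(n_signals):
                for j in range(n_signals):
                    K[a * n_signals + i, b * n_signals + j] = (
                        k_input(x, xp) * coupling(i, j, x, xp)
                    )
    return K
```

Making coupling depend on x and xp, not just on the signal indices i and j, is what distinguishes this mutually-dependent construction from a fixed coregionalization matrix.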
Non-Stationary Spectral Kernels
Remes, Sami, Heinonen, Markus, Kaski, Samuel
We propose non-stationary spectral kernels for Gaussian process regression, modelling the spectral density of a non-stationary kernel function as a mixture of input-dependent Gaussian process frequency density surfaces. We solve the generalised Fourier transform with such a model, and present a family of non-stationary and non-monotonic kernels that can learn input-dependent and potentially long-range, non-monotonic covariances between inputs. We derive efficient inference using model whitening and a marginalized posterior, and show with case studies that these kernels are necessary when modelling even rather simple time series, image or geospatial data with non-stationary characteristics.
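As a concrete illustration, here is a minimal sketch of an input-dependent spectral mixture kernel of the kind the abstract describes, assuming Gibbs-type handling of the input-dependent lengthscales; the function names and the exact parameterisation are illustrative, not the paper's definitive form.

```python
import numpy as np

def gibbs_term(x, y, ell):
    """Non-stationary squared-exponential term with an
    input-dependent lengthscale function ell(.)."""
    lx, ly = ell(x), ell(y)
    denom = lx**2 + ly**2
    return np.sqrt(2.0 * lx * ly / denom) * np.exp(-((x - y)**2) / denom)

def spectral_mixture_kernel(x, y, w_fns, ell_fns, mu_fns):
    """Mixture of Q components, each with input-dependent
    amplitude w_i(.), lengthscale ell_i(.) and frequency mu_i(.)."""
    return sum(
        w(x) * w(y) * gibbs_term(x, y, ell)
        * np.cos(2.0 * np.pi * (mu(x) * x - mu(y) * y))
        for w, ell, mu in zip(w_fns, ell_fns, mu_fns)
    )
```

Setting the amplitude, lengthscale and frequency functions to constants recovers an ordinary stationary spectral mixture component, which is why this family strictly generalises the stationary case.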
Random Fourier Features for Operator-Valued Kernels
Brault, Romain, d'Alché-Buc, Florence, Heinonen, Markus
Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to obtain an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner's theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof of concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
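A minimal sketch for one simple special case covered by such a framework: a decomposable translation-invariant kernel K(x, y) = k(x, y) A, with k a Gaussian kernel and A positive semi-definite. The function names are hypothetical, and the paper's general construction samples operator-valued features from the measure given by the generalised Bochner theorem rather than this shortcut.

```python
import numpy as np

def rff(X, W, b):
    """Scalar Gaussian-kernel random Fourier features (Rahimi-Recht)."""
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def approx_decomposable_kernel(X, Y, A, n_features, lengthscale, seed=0):
    """Monte-Carlo approximation of K(x, y) = k(x, y) * A.
    Returns blocks: out[n, :, m, :] ~= k(X[n], Y[m]) * A."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    k_hat = rff(X, W, b) @ rff(Y, W, b).T   # scalar kernel estimate
    return np.einsum('nm,pq->npmq', k_hat, A)
```

Note that the same random draws W and b must be shared between X and Y for the feature inner products to estimate the kernel.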
Non-Stationary Gaussian Process Regression with Hamiltonian Monte Carlo
Heinonen, Markus, Mannerström, Henrik, Rousu, Juho, Kaski, Samuel, Lähdesmäki, Harri
We present a novel approach for fully non-stationary Gaussian process regression (GPR), where all three key parameters -- noise variance, signal variance and lengthscale -- can be simultaneously input-dependent. We develop gradient-based inference methods to learn the unknown function and the non-stationary model parameters, without requiring any model approximations. We propose to infer the full parameter posterior with Hamiltonian Monte Carlo (HMC), which conveniently extends the analytical gradient-based GPR learning by guiding the sampling with model gradients. We also learn the MAP solution from the posterior by gradient ascent. In experiments on several synthetic datasets and in modelling of temporal gene expression, non-stationary GPR is shown to be necessary for modelling realistic input-dependent dynamics, while it performs comparably to conventional stationary or previous non-stationary GPR models otherwise.
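A minimal sketch of such a fully non-stationary covariance, assuming the Gibbs form for the input-dependent lengthscale; signal_var, lengthscale and noise_var stand in for the three latent parameter functions, which the paper itself equips with GP priors and samples with HMC.

```python
import numpy as np

def nonstationary_cov(x, signal_var, lengthscale, noise_var):
    """Covariance with input-dependent signal variance sigma(.),
    lengthscale ell(.) and noise omega(.): a Gibbs kernel scaled
    by sigma(x)sigma(x'), plus heteroscedastic diagonal noise."""
    s, l, o = signal_var(x), lengthscale(x), noise_var(x)
    L2 = l[:, None]**2 + l[None, :]**2
    K = (s[:, None] * s[None, :]
         * np.sqrt(2.0 * l[:, None] * l[None, :] / L2)
         * np.exp(-((x[:, None] - x[None, :])**2) / L2))
    return K + np.diag(o**2)
```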
Learning nonparametric differential equations with operator-valued kernels and gradient matching
Heinonen, Markus, d'Alché-Buc, Florence
Modeling dynamical systems with ordinary differential equations implies a mechanistic view of the process underlying the dynamics. However, in many cases this knowledge is not available. To overcome this issue, we introduce a general framework for nonparametric ODE models using penalized regression in Reproducing Kernel Hilbert Spaces (RKHS) based on operator-valued kernels. Moreover, we extend the scope of gradient matching approaches to nonparametric ODE models. A smooth estimate of the ODE solution is built to provide an approximation of the derivative of the solution, which is in turn used to learn the nonparametric ODE model. This approach benefits from the flexibility of penalized regression in RKHS, allowing for ridge or (structured) sparse regression as well. Strong results are demonstrated on three different ODE systems.
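A minimal sketch of two-step gradient matching, assuming a scalar RBF kernel ridge regression in place of the paper's operator-valued RKHS estimator, and raw finite differences in place of a penalised smooth estimate of the solution; fit_ode_rhs and its arguments are hypothetical names.

```python
import numpy as np

def rbf(A, B, bw):
    """Gaussian kernel matrix between row-stacked state vectors."""
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2.0 * bw**2))

def fit_ode_rhs(t, Y, bw=1.0, lam=1e-3):
    """Gradient matching: estimate dx/dt along the observed trajectory
    Y (shape [n, d]), then learn f in dx/dt = f(x) by kernel ridge
    regression of the derivative estimates on the states."""
    dY = np.gradient(Y, t, axis=0)    # crude derivative estimates
    alpha = np.linalg.solve(rbf(Y, Y, bw) + lam * np.eye(len(Y)), dY)
    return lambda x: rbf(np.atleast_2d(x), Y, bw) @ alpha
```

The appeal of this two-step scheme is that it never integrates the ODE during learning: the derivative estimates turn the problem into ordinary regularised regression.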