Leveraging Probabilistic Circuits for Nonparametric Multi-Output Regression

Zhongjie Yu, Mingye Zhu, Martin Trapp, Arseny Skryagin, Kristian Kersting

arXiv.org Machine Learning 

Inspired by recent advances in the field of expert-based approximations of Gaussian processes (GPs), we present an expert-based approach to large-scale multi-output regression using single-output GP experts. Employing a deeply structured mixture of single-output GPs encoded via a probabilistic circuit allows us to capture correlations between multiple output dimensions accurately. By recursively partitioning the covariate space and the output space, posterior inference in our model reduces to …

[…] DN), thus limiting their use to moderately sized data sets. To enable posterior inference in GPs on large-scale problems, recent work (see e.g. Liu et al. [2020] for a detailed review) mainly resorts to global approximations to the posterior, e.g., using inducing points, or local approximations that aim to distribute the computation of the posterior distribution onto local experts. Unfortunately, most of these approaches only focus on single-output regression, i.e., the dependent variable is univariate, and in the case of local approximations, they do not easily extend to multi-output regression tasks; see Bruinsma et al. [2020] for a detailed discussion of recent techniques for multi-output GPs.
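The abstract describes a circuit-structured mixture of single-output GP experts obtained by partitioning the covariate space and the output space. As a rough illustration of that idea (not the authors' model or implementation), the sketch below composes scikit-learn GP regressors through assumed GPLeaf/SumNode/ProductNode classes: the product node factorizes over output dimensions, the sum node mixes experts fitted on disjoint covariate regions, and the median split is an arbitrary choice made for this toy example only.

```python
# Minimal toy sketch of a circuit over single-output GP experts.
# Class names, the median split, and the scikit-learn leaves are
# assumptions for illustration, not the paper's method.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


class GPLeaf:
    """Single-output GP expert conditioned on a small subset of the observations."""

    def __init__(self, X, y):
        kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
        self.gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    def predict(self, X):
        return self.gp.predict(X, return_std=True)  # (mean, std), each of shape (n,)


class SumNode:
    """Mixture over covariate-space partitions: moment-matched combination of experts."""

    def __init__(self, children, weights):
        self.children = children
        self.weights = np.asarray(weights, dtype=float) / np.sum(weights)

    def predict(self, X):
        means, stds = zip(*(c.predict(X) for c in self.children))
        means, stds = np.stack(means), np.stack(stds)        # (k, n) each
        mean = self.weights @ means
        second_moment = self.weights @ (stds**2 + means**2)  # law of total variance
        return mean, np.sqrt(np.maximum(second_moment - mean**2, 0.0))


class ProductNode:
    """Factorization over output dimensions: one sub-circuit per output."""

    def __init__(self, children):
        self.children = children  # one child per output dimension

    def predict(self, X):
        means, stds = zip(*(c.predict(X) for c in self.children))
        return np.column_stack(means), np.column_stack(stds)  # (n, D) each


def build_circuit(X, Y):
    """Product over outputs; for each output, a sum over a two-way covariate split."""
    threshold = np.median(X[:, 0])  # simple axis-aligned split of the covariate space
    masks = [X[:, 0] <= threshold, X[:, 0] > threshold]
    per_output = []
    for d in range(Y.shape[1]):
        experts = [GPLeaf(X[m], Y[m, d]) for m in masks]
        per_output.append(SumNode(experts, weights=[m.mean() for m in masks]))
    return ProductNode(per_output)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    Y = np.column_stack([np.sin(X[:, 0]), np.cos(X[:, 0])]) + 0.1 * rng.normal(size=(200, 2))
    circuit = build_circuit(X, Y)
    mean, std = circuit.predict(np.linspace(-3.0, 3.0, 5).reshape(-1, 1))
    print(mean.shape, std.shape)  # (5, 2) each: per-point mean and std for both outputs
```

Note how prediction reduces to queries on single-output GP leaves, each conditioned only on its own data subset, which is the kind of reduction the abstract alludes to; the actual model additionally learns the partition structure and mixture weights, which this fixed toy split does not.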
