
A Supplementary Material

Neural Information Processing Systems

The supplementary material reports the following (figure text partially lost in extraction):
Figure A.1: The median difference in GP log score between the forward and backward model.
Figure A.3: Distribution of graphs; cyclic graphs occasionally returned by DiBS+ were discarded.
An additional experiment compares the ability of the different methods to model the posterior distribution over DAGs as a function of their run-time.
Figure A.4: Reverse K-L divergence between the "true" posterior (obtained by enumerating every possible structure) and the BGe posterior (green) and DiBS+.
Figure A.5: Distribution of the number of score evaluations performed by the different methods.
Figure A.9: The corresponding run-times.
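The reverse K-L comparison against an enumerated "true" posterior can be sketched as follows. This is a minimal illustration, not the paper's code: the toy structures, probabilities, and the smoothing constant `eps` are assumptions, and the convention used here is KL(q‖p) with q the approximate posterior.

```python
import math

def reverse_kl(p, q, eps=1e-12):
    # Reverse K-L divergence KL(q || p) between an approximate posterior q
    # and the "true" posterior p, both given as dicts mapping a structure
    # label (e.g. an adjacency-matrix encoding) to its probability.
    # eps guards against structures with zero probability under p
    # (an illustrative smoothing choice, not from the paper).
    return sum(qg * math.log(qg / max(p.get(g, 0.0), eps))
               for g, qg in q.items() if qg > 0.0)

# Toy example over three hypothetical structures on two nodes:
p = {"empty": 0.5, "x->y": 0.3, "y->x": 0.2}   # "true" posterior (enumerated)
q = {"empty": 0.4, "x->y": 0.4, "y->x": 0.2}   # sampled approximation
print(round(reverse_kl(p, q), 4))  # → 0.0258
```

In practice p would come from enumerating every possible structure (feasible only for very small networks) and q from the empirical frequencies of sampled structures.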


A Bayesian Take on Gaussian Process Networks

Enrico Giudice, Jack Kuipers, Giusi Moffa

arXiv.org Machine Learning

Gaussian Process Networks (GPNs) are a class of directed graphical models which employ Gaussian processes as priors for the conditional expectation of each variable given its parents in the network. The model allows the description of continuous joint distributions in a compact but flexible manner with minimal parametric assumptions on the dependencies between variables. Bayesian structure learning of GPNs requires computing the posterior over graphs of the network and is computationally infeasible even in low dimensions. This work implements Monte Carlo and Markov Chain Monte Carlo methods to sample from the posterior distribution of network structures. As such, the approach follows the Bayesian paradigm, comparing models via their marginal likelihood and computing the posterior probability of the GPN features. Simulation studies show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network and provides an accurate approximation of its posterior distribution.
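The core scoring step, comparing candidate structures via the GP marginal likelihood of each variable given its parents, can be sketched as below. This is a hedged illustration only: the RBF kernel, the fixed hyperparameters, and the synthetic data are assumptions for the example, not the paper's settings (which would involve priors over hyperparameters and full network scores).

```python
import numpy as np

def gp_log_marginal_likelihood(x, y, lengthscale=1.0, signal_var=1.0, noise_var=0.1):
    # Log marginal likelihood of y given input x under a zero-mean GP
    # with an RBF kernel. Hyperparameters are illustrative defaults.
    d = x[:, None] - x[None, :]
    K = signal_var * np.exp(-0.5 * (d / lengthscale) ** 2)
    K += noise_var * np.eye(len(x))          # add observation noise
    L = np.linalg.cholesky(K)                # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

# Synthetic data where y depends nonlinearly on x:
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = np.sin(x) + 0.1 * rng.normal(size=50)

forward = gp_log_marginal_likelihood(x, y)   # score for the edge x -> y
backward = gp_log_marginal_likelihood(y, x)  # score for the edge y -> x
print(round(forward, 2), round(backward, 2))
```

In a full structure-learning run, a score of this form would be evaluated for each node given each candidate parent set, and the Monte Carlo / MCMC samplers would use these scores to weight or accept network structures.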