

Appendix 1: Bayes-by-backprop. The Bayesian posterior distribution over neural network weights, P(w | D), is approximated by a diagonal-Gaussian variational distribution q(w | θ), whose parameters θ are fit by minimising KL[q(w | θ) ∥ P(w | D)] with reparameterised gradient estimates (Bayes-by-backprop).
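A minimal NumPy sketch of the Bayes-by-backprop update, using a toy single-weight regression model (the data, learning rate, and prior N(0, 1) here are illustrative assumptions, not the paper's actual setup): a Gaussian q(w) = N(mu, sigma²) is fit by stochastic gradient descent on the negative ELBO via the reparameterisation trick.

```python
# Toy Bayes-by-backprop sketch (hypothetical example, not the paper's model):
# fit q(w) = N(mu, sigma^2) to the posterior of one regression weight.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + 0.1 * rng.normal(size=200)  # data generated with true w = 2

mu, rho = 0.0, -3.0  # variational parameters; sigma = softplus(rho) > 0
lr = 0.05

for _ in range(500):
    sigma = np.log1p(np.exp(rho))          # softplus keeps sigma positive
    eps = rng.normal()
    w = mu + sigma * eps                   # reparameterised sample w ~ q
    g_w = np.sum((w * x - y) * x)          # grad of neg. log-likelihood wrt w
    g_mu = g_w + mu                        # + grad of KL[q || N(0,1)] wrt mu
    g_sigma = g_w * eps + (sigma - 1.0 / sigma)  # KL grad wrt sigma
    mu -= lr * g_mu / len(x)
    # chain rule: d sigma / d rho = sigmoid(rho)
    rho -= lr * (g_sigma / (1.0 + np.exp(-rho))) / len(x)
```

After training, `mu` is close to the true weight 2 and `softplus(rho)` gives the learned posterior scale.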

Neural Information Processing Systems

In Algorithm 1 we give the full clustering algorithm used for each of the T fixing iterations. In Figure 1 we show how the layers' […]. In Figure 2 we show the impact of increasing the regularisation strength.



Autoconj: Recognizing and Exploiting Conjugacy Without a Domain-Specific Language

Matthew D. Hoffman

Neural Information Processing Systems

Deriving conditional and marginal distributions using conjugacy relationships can be time consuming and error prone. In this paper, we propose a strategy for automating such derivations. Unlike previous systems which focus on relationships between pairs of random variables, our system (which we call Autoconj) operates directly on Python functions that compute log-joint distribution functions. Autoconj provides support for conjugacy-exploiting algorithms in any Python-embedded PPL. This paves the way for accelerating development of novel inference algorithms and structure-exploiting modeling strategies.
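As a concrete instance of the conjugacy relationships such a system recognises, the sketch below hand-derives the Beta-Bernoulli case from a Python log-joint function (an illustrative example of the derivation Autoconj automates; the function names and prior parameters are assumptions, not Autoconj's API).

```python
# Hand-derived Beta-Bernoulli conjugate update -- the kind of derivation
# Autoconj is designed to automate from a log-joint function.
import math

def log_joint(theta, data, a=2.0, b=2.0):
    """log p(theta, data) for a Beta(a, b) prior and Bernoulli likelihood."""
    heads = sum(data)
    tails = len(data) - heads
    log_prior = ((a - 1) * math.log(theta) + (b - 1) * math.log(1 - theta)
                 - (math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)))
    log_lik = heads * math.log(theta) + tails * math.log(1 - theta)
    return log_prior + log_lik

# By conjugacy, the exact posterior is Beta(a + heads, b + tails):
data = [1, 1, 0, 1, 0]
a_post = 2.0 + sum(data)                 # 5.0
b_post = 2.0 + (len(data) - sum(data))   # 4.0
```

Conjugacy means differences of the unnormalised log-joint in `theta` match differences of the Beta(a_post, b_post) log-density exactly, which is what lets the conditional be read off in closed form.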








[Figure: weight as a function of distance to anchor; x-axis: distance to anchor (0–30), y-axis: weight (0.0–1.0).]

Neural Information Processing Systems

[Plot legend: upper bound; sigmoid (4), γ = 0.8.]

We thank all reviewers and ACs. We appreciate the many positive and constructive comments. We will update our paper and supplementary material accordingly. Unfortunately, [28] was lost from the citations in L80, which may have made this unclear.