On Feynman--Kac training of partial Bayesian neural networks
Zheng Zhao, Sebastian Mair, Thomas B. Schön, Jens Sjölund
Recently, partial Bayesian neural networks (pBNNs), which only consider a subset of the parameters to be stochastic, were shown to perform competitively with full Bayesian neural networks. However, pBNNs are often multi-modal in the latent-variable space and thus challenging to approximate with parametric models. To address this problem, we propose an efficient sampling-based training strategy, wherein the training of a pBNN is formulated as simulating a Feynman--Kac model. We then describe variations of sequential Monte Carlo samplers that allow us to simultaneously estimate the parameters and the latent posterior distribution of this model at a tractable computational cost. We show on various synthetic and real-world datasets that our proposed training scheme outperforms the state of the art in terms of predictive performance.
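The idea of treating only a subset of the network's parameters as stochastic, and training them with a sequential Monte Carlo (SMC) sampler while the remaining parameters are optimized deterministically, can be illustrated with a minimal sketch. This is not the paper's algorithm: the toy model (a linear regressor with a stochastic bias `b` and a deterministic slope `w_det`), the resampling threshold, the jitter rejuvenation move, and all names here are hypothetical simplifications.

```python
# Minimal sketch of SMC-style training of a "partial Bayesian" model
# (illustrative only; the model, names, and tuning are assumptions).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from y = 2 x + 1 + noise; b = 1 is the latent (stochastic)
# parameter, w = 2 is the deterministic parameter to be estimated.
x = rng.normal(size=200)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=200)

n_particles = 100
particles = rng.normal(0.0, 2.0, size=n_particles)  # particles for latent b
log_w = np.zeros(n_particles)                        # particle log-weights
w_det = 0.0                                          # deterministic parameter
lr, sigma = 0.1, 0.1

for epoch in range(50):
    for start in range(0, len(x), 20):
        xb, yb = x[start:start + 20], y[start:start + 20]
        # Gaussian log-likelihood of each particle on the minibatch
        resid = yb[None, :] - (w_det * xb[None, :] + particles[:, None])
        log_w += -0.5 * np.sum(resid**2, axis=1) / sigma**2
        # Normalized weights and effective sample size
        lw = log_w - log_w.max()
        wts = np.exp(lw)
        wts /= wts.sum()
        ess = 1.0 / np.sum(wts**2)
        if ess < n_particles / 2:
            # Multinomial resampling plus a small jitter (rejuvenation) move
            idx = rng.choice(n_particles, size=n_particles, p=wts)
            particles = particles[idx] + 0.01 * rng.normal(size=n_particles)
            log_w = np.zeros(n_particles)
            wts = np.full(n_particles, 1.0 / n_particles)
        # Gradient step on w_det using the particle-weighted squared loss
        grad = -2.0 * np.sum(wts[:, None] * resid * xb[None, :]) / len(xb)
        w_det -= lr * grad

print(w_det, float(np.mean(particles)))
```

The deterministic parameter converges toward its true value while the particle cloud approximates the posterior over the stochastic parameter; a full pBNN would replace the linear model with a network and the scalar latent with a parameter subset.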
Oct-30-2023