Dynamic Bayesian Neural Networks
Lorenzo Rimella, Nick Whiteley
We define an evolving-in-time Bayesian neural network called a Hidden Markov neural network. The weights of a feed-forward neural network are modelled as the hidden states of a Hidden Markov model, whose observed process is given by the available data. A filtering algorithm is used to learn a variational approximation to the evolving-in-time posterior over the weights. Training is pursued through a sequential version of Bayes by Backprop (Blundell et al., 2015), enriched with a stronger regularization technique called variational DropConnect. The experiments test variational DropConnect on MNIST and demonstrate the performance of Hidden Markov neural networks on time series.
24 Jun 2020
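
The abstract describes the training procedure only at a high level. Below is a minimal, hypothetical PyTorch sketch of one predict-update step of a sequential Bayes-by-Backprop filter of this flavour, for intuition only. The mean-field Gaussian variational family, the Gaussian AR(1) transition kernel with parameters alpha and sig_w, and all names (filter_step, kl_gaussian, linear_nll) are illustrative assumptions, not the paper's exact construction, which additionally uses variational DropConnect.

import torch
import torch.nn.functional as F

def kl_gaussian(mu_q, sig_q, mu_p, sig_p):
    # KL( N(mu_q, sig_q^2) || N(mu_p, sig_p^2) ), summed over all weights
    return (torch.log(sig_p / sig_q)
            + (sig_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sig_p ** 2) - 0.5).sum()

def filter_step(mu_prev, sig_prev, x_t, y_t, nll_fn,
                alpha=0.9, sig_w=0.1, n_iters=200, lr=1e-2):
    # Predict: push the previous variational posterior N(mu_prev, sig_prev^2)
    # through an assumed AR(1) transition w_t = alpha * w_{t-1} + N(0, sig_w^2)
    # to obtain the prior over the weights at time t.
    mu_p = alpha * mu_prev
    sig_p = torch.sqrt(alpha ** 2 * sig_prev ** 2 + sig_w ** 2)

    # Update: fit a new mean-field Gaussian by minimising
    # NLL(data_t) + KL(q_t || prior_t) via the reparameterization trick,
    # as in Bayes by Backprop applied to this single time step.
    mu_q = mu_p.clone().requires_grad_(True)
    rho_q = torch.log(torch.expm1(sig_p)).clone().requires_grad_(True)  # sig = softplus(rho)
    opt = torch.optim.Adam([mu_q, rho_q], lr=lr)

    for _ in range(n_iters):
        opt.zero_grad()
        sig_q = F.softplus(rho_q)
        w = mu_q + sig_q * torch.randn_like(mu_q)   # one reparameterized sample
        loss = nll_fn(w, x_t, y_t) + kl_gaussian(mu_q, sig_q, mu_p, sig_p)
        loss.backward()
        opt.step()

    return mu_q.detach(), F.softplus(rho_q).detach()

# Toy usage: the two weights of a 1-d linear model tracked over a data stream.
def linear_nll(w, x, y):                      # Gaussian likelihood, unit noise
    return 0.5 * ((x * w[0] + w[1] - y) ** 2).sum()

mu, sig = torch.zeros(2), torch.ones(2)
for t in range(5):                            # stream of time-indexed batches
    x_t = torch.randn(32)
    y_t = 2.0 * x_t + 1.0 + 0.1 * torch.randn(32)
    mu, sig = filter_step(mu, sig, x_t, y_t, linear_nll)
print(mu)  # posterior means drift toward the generating weights (2, 1)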