Incorporating Unlabelled Data into Bayesian Neural Networks

Mrinank Sharma, Tom Rainforth, Yee Whye Teh, Vincent Fortuin

arXiv.org Artificial Intelligence 

Conventional Bayesian Neural Networks (BNNs) cannot leverage unlabelled data to improve their predictions. To overcome this limitation, we introduce Self-Supervised Bayesian Neural Networks, which use unlabelled data to learn improved prior predictive distributions by maximising an evidence lower bound during an unsupervised pre-training step. With a novel methodology developed to better understand prior predictive distributions, we then show that self-supervised prior predictives capture image semantics better than conventional BNN priors. In our empirical evaluations, we see that self-supervised BNNs offer the label efficiency of self-supervised methods and the uncertainty estimates of Bayesian methods, particularly outperforming conventional BNNs in low-to-medium data regimes.
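
To make the two-stage recipe in the abstract concrete, below is a minimal, hypothetical PyTorch sketch: an unsupervised pre-training stage that maximises a simple reconstruction-style evidence lower bound (ELBO) on unlabelled data, after which the learned variational posterior over the shared encoder weights is frozen and reused as the prior for supervised variational training, so the unlabelled data shapes the prior predictive seen downstream. The architecture, the Gaussian reconstruction likelihood, and all names (MeanFieldLinear, pretrain_step, finetune_step) are illustrative assumptions, not the paper's exact objective or code.

# Hypothetical sketch of self-supervised BNN training (not the paper's code).
# Stage 1 maximises an unsupervised ELBO on unlabelled data; stage 2 reuses
# the learned posterior over shared weights as the prior for supervised
# variational inference.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeanFieldLinear(nn.Module):
    # Linear layer with a factorised Gaussian distribution over its weights,
    # sampled via the reparameterisation trick on each forward pass.
    def __init__(self, d_in, d_out):
        super().__init__()
        self.mu = nn.Parameter(0.01 * torch.randn(d_out, d_in))
        self.log_sigma = nn.Parameter(torch.full((d_out, d_in), -4.0))

    def forward(self, x):
        w = self.mu + self.log_sigma.exp() * torch.randn_like(self.mu)
        return x @ w.t()

    def kl_to(self, prior_mu, prior_log_sigma):
        # KL(q || p) between diagonal Gaussians, summed over all weights.
        pm = torch.as_tensor(prior_mu, dtype=self.mu.dtype)
        pl = torch.as_tensor(prior_log_sigma, dtype=self.mu.dtype)
        return 0.5 * torch.sum(
            (2 * (self.log_sigma - pl)).exp()
            + (self.mu - pm) ** 2 * (-2 * pl).exp()
            - 1.0
            - 2 * (self.log_sigma - pl)
        )

class SelfSupervisedBNN(nn.Module):
    def __init__(self, d_in=784, d_hidden=256, d_out=10):
        super().__init__()
        self.encoder = MeanFieldLinear(d_in, d_hidden)      # shared weights
        self.decoder = MeanFieldLinear(d_hidden, d_in)      # pre-training head
        self.classifier = MeanFieldLinear(d_hidden, d_out)  # supervised head

    def features(self, x):
        return F.relu(self.encoder(x))

def pretrain_step(model, x, opt, n_unlab):
    # Stage 1: unsupervised ELBO = reconstruction likelihood minus KL to a
    # standard-normal prior, estimated on a minibatch of unlabelled data.
    opt.zero_grad()
    nll = F.mse_loss(model.decoder(model.features(x)), x, reduction="sum")
    kl = model.encoder.kl_to(0.0, 0.0) + model.decoder.kl_to(0.0, 0.0)
    loss = nll + kl * x.shape[0] / n_unlab  # minibatch-scaled negative ELBO
    loss.backward()
    opt.step()

def finetune_step(model, x, y, opt, prior, n_lab):
    # Stage 2: supervised ELBO, with the frozen pre-trained posterior over
    # encoder weights acting as the prior for the downstream task.
    prior_mu, prior_log_sigma = prior
    opt.zero_grad()
    nll = F.cross_entropy(model.classifier(model.features(x)), y,
                          reduction="sum")
    kl = (model.encoder.kl_to(prior_mu, prior_log_sigma)
          + model.classifier.kl_to(0.0, 0.0))
    loss = nll + kl * x.shape[0] / n_lab
    loss.backward()
    opt.step()

# Usage on random tensors standing in for images:
model = SelfSupervisedBNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_unlab = torch.rand(512, 784)  # large unlabelled pool
for _ in range(200):
    pretrain_step(model, x_unlab, opt, n_unlab=512)
prior = (model.encoder.mu.detach().clone(),
         model.encoder.log_sigma.detach().clone())
x_lab, y_lab = torch.rand(64, 784), torch.randint(0, 10, (64,))
for _ in range(200):
    finetune_step(model, x_lab, y_lab, opt, prior, n_lab=64)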
