A Bayesian Approach to Invariant Deep Neural Networks
Nikolaos Mourdoukoutas, Marco Federici, Georges Pantalos, Mark van der Wilk, Vincent Fortuin
We propose a novel Bayesian neural network architecture that can learn invariances from data alone by inferring a posterior distribution over different weight-sharing schemes. We show that our model outperforms other non-invariant architectures when trained on datasets that contain specific invariances. The same holds true when no data augmentation is performed.

Contributions: We propose a method to learn such weight-sharing schemes from data. As a proof of concept, we focus on being invariant to two types of transformations applied on images, namely rotations and flips. However, our algorithm can be applied to any other choice of symmetry, as long as the corresponding weight-sharing scheme is available. Apart from achieving good performance during inference, our model is able to learn such invariances from data. This is achieved by specifying a probability distribution over the weight-sharing schemes.
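To illustrate the idea of a learnable distribution over symmetry transformations, here is a minimal NumPy sketch. It is an assumption-laden toy, not the paper's architecture: the feature map `feature_fn`, the fixed random matrix `W`, and the softmax parameterization of the group weights are all illustrative choices. It averages features over the dihedral group D4 (rotations and flips of an image) with learnable weights; with uniform weights the output is exactly D4-invariant.

```python
import numpy as np

def d4_orbit(x):
    """All 8 transforms of a square image under the dihedral group D4
    (4 rotations, each with or without a horizontal flip)."""
    out = []
    for flip in (False, True):
        y = np.fliplr(x) if flip else x
        for k in range(4):
            out.append(np.rot90(y, k))
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def averaged_features(x, feature_fn, logits):
    """Weighted average of features over the D4 orbit of x.
    `logits` parameterize a distribution over group elements (and would
    be learned in the full model); uniform weights give exact invariance."""
    weights = softmax(logits)
    feats = np.stack([feature_fn(g) for g in d4_orbit(x)])
    return (weights[:, None] * feats).sum(axis=0)

# Toy feature map: flattened image through a fixed random linear layer.
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 8))
feature_fn = lambda img: img.reshape(-1) @ W

x = rng.normal(size=(4, 4))
uniform_logits = np.zeros(8)
f_x = averaged_features(x, feature_fn, uniform_logits)
f_rot = averaged_features(np.rot90(x), feature_fn, uniform_logits)
assert np.allclose(f_x, f_rot)  # uniform weights: rotation-invariant output
```

In the paper's setting the distribution over schemes is inferred as a posterior rather than fixed, so the model can concentrate mass on only those symmetries that the data actually exhibits.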
Jul-20-2021