Nonparametric Bayesian Deep Networks with Local Competition
Panousis, Konstantinos P., Chatzis, Sotirios, Theodoridis, Sergios
Local competition among neighboring neurons is a common phenomenon in biological neural systems. This observation has inspired research on more biologically plausible deep networks that comprise competing linear units, as opposed to nonlinear units that entail no form of (local) competition. This paper revisits this modeling paradigm, with the aim of enabling inference of networks that retain state-of-the-art accuracy at the lowest possible model complexity; this includes the required number of connections or locally competing sets of units, as well as the floating-point precision needed to store the network weights. To this end, we leverage solid arguments from the field of Bayesian nonparametrics. Specifically, we introduce auxiliary discrete latent variables representing model component utility, and perform Bayesian inference over them. We then impose appropriate stick-breaking priors on these discrete latent variables; these give rise to a well-established sparsity-inducing mechanism. As we show experimentally on benchmark datasets, our approach yields networks with a smaller memory footprint than the state of the art, without compromising predictive accuracy.
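The sparsity mechanism described above can be illustrated with a minimal sketch of a stick-breaking construction over component-utility indicators. This is an assumption-laden toy example (function name, hyperparameters, and the specific Indian-Buffet-Process-style parameterization `Beta(alpha, 1)` are illustrative choices, not the paper's exact formulation): the stick-breaking weights decay with the component index, so later components are switched off with high probability, which is what induces sparsity over connections or competing sets of units.

```python
import numpy as np

def stick_breaking_utilities(K, alpha, rng):
    """Toy stick-breaking prior over K component-utility indicators.

    v_k ~ Beta(alpha, 1);  pi_k = prod_{j<=k} v_j  (non-increasing in k);
    z_k ~ Bernoulli(pi_k) marks component k as active (utilized) or not.
    """
    v = rng.beta(alpha, 1.0, size=K)   # stick-breaking fractions
    pi = np.cumprod(v)                 # decaying utility probabilities
    z = rng.random(K) < pi             # discrete utility indicators
    return pi, z

rng = np.random.default_rng(0)
pi, z = stick_breaking_utilities(K=20, alpha=3.0, rng=rng)
# pi is sorted in non-increasing order, so high-index components
# are rarely active; the number of active components stays finite
# in expectation even as K grows.
```

In a full model, Bayesian inference over the indicators `z` (rather than forward sampling as here) determines which connections or competing unit blocks are retained.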
Jan-23-2019