
Neural Information Processing Systems 

The paper provides a theoretical analysis of the types of distributions that can be efficiently represented by restricted Boltzmann machines (RBMs). The analysis is based on a representation of the unnormalized log probability (negative free energy) of an RBM as a special form of neural network (NN). The paper relates these RBM networks to more common types of NNs whose properties have been studied in the literature. This approach allows the authors to identify two non-trivial examples of functions that can and cannot be represented efficiently by RBM networks, and hence distributions that can and cannot be modeled efficiently by RBMs. Specifically, they show that RBM networks can efficiently represent any function that depends only on the number of non-zero visible units, such as parity, but that they cannot efficiently represent the only somewhat more difficult inner-product parity function.
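To make the "RBM network" view concrete: summing out the binary hidden units of an RBM yields an unnormalized log probability of the visible vector that has the form of a one-hidden-layer softplus network. The following is a minimal NumPy sketch of this standard identity (the parameter names `a`, `b`, `W` and the brute-force check are illustrative, not taken from the paper's construction):

```python
import numpy as np

def softplus(x):
    # log(1 + exp(x)), computed stably via log1p
    return np.log1p(np.exp(x))

def rbm_log_unnorm_prob(v, a, b, W):
    """Unnormalized log probability (negative free energy) of visible vector v.

    a: visible biases, shape (n,); b: hidden biases, shape (m,);
    W: weight matrix, shape (m, n). Summing the Boltzmann distribution
    exp(a.v + b.h + h.W.v) over all binary hidden states h gives this
    closed form: a linear term plus a softplus hidden layer.
    """
    return a @ v + softplus(b + W @ v).sum()

# tiny illustrative example with arbitrary random parameters
rng = np.random.default_rng(0)
n, m = 4, 3
a = rng.normal(size=n)
b = rng.normal(size=m)
W = rng.normal(size=(m, n))
v = np.array([1, 0, 1, 1])
print(rbm_log_unnorm_prob(v, a, b, W))
```

The paper's results then ask which functions of `v` such softplus networks can compute with polynomially many hidden units.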