Continuous Sigmoidal Belief Networks Trained using Slice Sampling
Neural Information Processing Systems
These include Boltzmann machines (Hinton and Sejnowski 1986), binary sigmoidal belief networks (Neal 1992) and Helmholtz machines (Hinton et al. 1995; Dayan et al. 1995). However, some hidden variables, such as translation or scaling in images of shapes, are best represented using continuous values. Continuous-valued Boltzmann machines have been developed (Movellan and McClelland 1993), but these suffer from long simulation settling times and the requirement of a "negative phase" during learning. Tibshirani (1992) and Bishop et al. (1996) consider learning mappings from a continuous latent variable space to a higher-dimensional input space. MacKay (1995) has developed "density networks" that can model both continuous and categorical latent spaces using stochasticity at the topmost network layer. In this paper I consider a new hierarchical top-down connectionist model that has stochastic hidden variables at all layers; moreover, these variables can adapt to be continuous or categorical. The proposed top-down model can be viewed as a continuous-valued belief network, which can be simulated by performing a quick top-down pass (Pearl 1988).
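The "quick top-down pass" mentioned above is ancestral sampling: each layer is sampled given its parents in a single sweep from the top of the network down. The sketch below illustrates this for a layered network in which each unit passes a noisy weighted sum of its parents through a sigmoid, so the sampled values are continuous and lie in (0, 1). The layer sizes, weights, and noise level are illustrative assumptions, not taken from the paper.

```python
import math
import random

def sigmoid(x):
    # Standard logistic squashing function.
    return 1.0 / (1.0 + math.exp(-x))

def top_down_pass(layer_weights, layer_biases, noise_std, rng):
    """Sample every layer of a layered continuous sigmoidal belief
    network in a single ancestral (top-down) pass.

    layer_weights[l][i][j] connects unit j of layer l to unit i of
    layer l+1; the top layer is driven by bias plus noise alone.
    """
    # Top layer has no parents: its input is bias plus Gaussian noise.
    values = [sigmoid(b + rng.gauss(0.0, noise_std))
              for b in layer_biases[0]]
    layers = [values]
    for W, biases in zip(layer_weights, layer_biases[1:]):
        parents = layers[-1]
        child = []
        for row, b in zip(W, biases):
            pre = b + sum(w * p for w, p in zip(row, parents))
            # Noise enters before the sigmoid, so each unit's value
            # is a stochastic continuous quantity in (0, 1).
            child.append(sigmoid(pre + rng.gauss(0.0, noise_std)))
        layers.append(child)
    return layers

# Illustrative 2-3-4 network with random weights (hypothetical sizes).
rng = random.Random(0)
weights = [
    [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(3)],  # 2 -> 3
    [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(4)],  # 3 -> 4
]
biases = [[0.0] * 2, [0.0] * 3, [0.0] * 4]
sample = top_down_pass(weights, biases, noise_std=1.0, rng=rng)
```

Because every unit is sampled exactly once given its already-sampled parents, the pass costs one forward sweep through the network, in contrast to the iterative settling required by continuous Boltzmann machines.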
Dec-31-1997