Relative Density Nets: A New Way to Combine Backpropagation with HMM's

Brown, Andrew D., Hinton, Geoffrey E.

Neural Information Processing Systems 

Gatsby Unit, UCL, London, UK WC1N 3AR
hinton@gatsby.ucl.ac.uk

Abstract

Logistic units in the first hidden layer of a feedforward neural network compute the relative probability of a data point under two Gaussians. This leads us to consider substituting other density models. We present an architecture for performing discriminative learning of Hidden Markov Models using a network of many small HMM's. Experiments on speech data show it to be superior to the standard method of discriminatively training HMM's.

1 Introduction

A standard way of performing classification using a generative model is to divide the training cases into their respective classes and then train a set of class-conditional models. This unsupervised approach to classification is appealing for two reasons.
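To make the abstract's opening claim concrete: for two Gaussians with equal variance, Bayes' rule gives a class posterior that is exactly a logistic function of the input, so a sigmoid unit with appropriately chosen weight and bias computes the relative probability of a data point under the two densities. A minimal one-dimensional sketch (the means, variance, and equal class priors here are illustrative, not taken from the paper):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Two class-conditional Gaussians with equal variance (equal priors assumed).
mu1, mu2, sigma = 1.0, -1.0, 0.5

def posterior(x):
    """P(class 1 | x) by Bayes' rule."""
    p1 = gaussian_pdf(x, mu1, sigma)
    p2 = gaussian_pdf(x, mu2, sigma)
    return p1 / (p1 + p2)

# The same posterior as a logistic unit sigmoid(w*x + b), where the
# weight and bias come from expanding the log-odds log(p1/p2):
#   w = (mu1 - mu2) / sigma^2,  b = (mu2^2 - mu1^2) / (2 * sigma^2)
w = (mu1 - mu2) / sigma ** 2
b = (mu2 ** 2 - mu1 ** 2) / (2 * sigma ** 2)

for x in (-2.0, 0.0, 0.3, 2.0):
    assert abs(posterior(x) - sigmoid(w * x + b)) < 1e-12
```

The paper's move is to note that nothing in this construction requires the two densities to be Gaussians: any pair of tractable density models (here, small HMM's) can play the same role in the log-odds.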
