Mixture Density Estimation

Li, Jonathan Q., Barron, Andrew R.

Neural Information Processing Systems 

Gaussian mixtures (or so-called radial basis function networks) for density estimation provide a natural counterpart to sigmoidal neural networks for function fitting and approximation. In both cases, it is possible to give simple expressions for the iterative improvement of performance as components of the network are introduced one at a time. In particular, for mixture density estimation we show that a k-component mixture estimated by maximum likelihood (or by an iterative likelihood improvement that we introduce) achieves log-likelihood within order 1/k of the log-likelihood achievable by any convex combination. Consequences for approximation and estimation using Kullback-Leibler risk are also given. A Minimum Description Length principle selects the optimal number of components k that minimizes the risk bound.

1 Introduction

In density estimation, Gaussian mixtures provide flexible-basis representations for densities that can be used to model heterogeneous data in high dimensions.
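The greedy scheme described in the abstract — building a k-component mixture by introducing one component at a time, each chosen to improve the log-likelihood of the convex combination — can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: it assumes a fixed component width `sigma`, a finite candidate grid of means `mus`, and the step size alpha = 2/(step+1) commonly used in greedy convex-combination analyses.

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Univariate Gaussian density evaluated at the points x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def greedy_mixture(data, k, mus, sigma=1.0):
    """Greedily build a k-component Gaussian mixture (illustrative sketch).

    At each step, the new component's mean is chosen from the candidate
    grid `mus` to maximize the log-likelihood of the convex combination
    (1 - alpha) * f + alpha * phi, with alpha = 2 / (step + 1).
    Returns the fitted density at the data points and its log-likelihood.
    """
    f = np.zeros_like(data, dtype=float)  # current mixture density at data points
    best_ll = -np.inf
    for step in range(1, k + 1):
        alpha = 1.0 if step == 1 else 2.0 / (step + 1)
        best_ll, best_phi = -np.inf, None
        for mu in mus:
            phi = gauss_pdf(data, mu, sigma)
            cand = (1 - alpha) * f + alpha * phi
            ll = np.sum(np.log(cand + 1e-300))  # guard against log(0)
            if ll > best_ll:
                best_ll, best_phi = ll, phi
        f = (1 - alpha) * f + alpha * best_phi
    return f, best_ll
```

On bimodal data, a single component must compromise between the modes, while the greedy mixture places mass near both; the gap between the k-component log-likelihood and the best achievable convex combination shrinks at the 1/k rate stated in the abstract.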
