Welling, Max
Linear Response for Approximate Inference
Welling, Max, Teh, Yee W.
Learning Sparse Topographic Representations with Products of Student-t Distributions
Welling, Max, Osindero, Simon, Hinton, Geoffrey E.
We propose a model for natural images in which the probability of an image is proportional to the product of the probabilities of some filter outputs. We encourage the system to find sparse features by using a Student-t distribution to model each filter output. If the t-distribution is used to model the combined outputs of sets of neurally adjacent filters, the system learns a topographic map in which the orientation, spatial frequency and location of the filters change smoothly across the map. Even though maximum likelihood learning is intractable in our model, the product form allows a relatively efficient learning procedure that works well even for highly overcomplete sets of filters. Once the model has been learned it can be used as a prior to derive the "iterated Wiener filter" for the purpose of denoising images.
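The product form described in the abstract can be sketched in a few lines: the unnormalized log-probability of an image is a sum over experts of Student-t log-densities of the filter outputs. The dimensions, the random filter bank `W`, and the shape parameter `alpha` below are illustrative assumptions, not the paper's learned values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 64-pixel patches, 128 filters (an overcomplete set).
n_pixels, n_filters = 64, 128
W = rng.standard_normal((n_filters, n_pixels))  # stand-in for learned filters
alpha = 1.5                                     # Student-t shape parameter

def log_prob_unnorm(x, W, alpha):
    """Unnormalized log-probability under a product of Student-t experts:
    sum_i -alpha * log(1 + (w_i . x)^2) over all filter outputs."""
    y = W @ x
    return -alpha * np.sum(np.log1p(y ** 2))

x = rng.standard_normal(n_pixels)
print(log_prob_unnorm(x, W, alpha))
```

Sparsity arises because the heavy-tailed t-distribution tolerates occasional large filter responses while penalizing many moderate ones.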
Self Supervised Boosting
Welling, Max, Zemel, Richard S., Hinton, Geoffrey E.
Boosting algorithms and successful applications thereof abound for classification and regression learning problems, but not for unsupervised learning. We propose a sequential approach to adding features to a random field model by training them to improve classification performance between the data and an equal-sized sample of "negative examples" generated from the model's current estimate of the data density.
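The sequential feature-adding idea can be illustrated with a toy sketch. For simplicity the negatives below are drawn from a fixed broad proposal rather than from the model's own current density estimate (as the paper does), and the features are simple threshold stumps weighted by log-odds; all of these choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "data": samples concentrated around 2.0.
data = rng.normal(2.0, 0.5, size=500)

# Additive model: score(x) = sum_t w_t * f_t(x), features are stumps.
features = []  # list of (threshold, weight)

def score(x):
    """Unnormalized log-density under the boosted feature set."""
    return sum(w * (x > th) for th, w in features)

for _ in range(5):
    # Negative examples from a broad stand-in proposal (the paper instead
    # samples from the model's current estimate of the data density).
    neg = rng.normal(0.0, 3.0, size=500)
    # Greedily pick the stump that best separates data from negatives.
    best = None
    for th in np.linspace(-5, 5, 41):
        p_pos = np.mean(data > th)
        p_neg = np.mean(neg > th)
        gap = abs(p_pos - p_neg)
        if best is None or gap > best[0]:
            best = (gap, th, p_pos, p_neg)
    _, th, p_pos, p_neg = best
    eps = 1e-3  # smoothing to keep the log-odds finite
    features.append((th, np.log((p_pos + eps) / (p_neg + eps))))

# Points near the data mode now receive a higher score than points far away.
print(score(2.0), score(-3.0))
```

Each round adds one feature that discriminates data from negatives, so the model's score concentrates on regions where the data live.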
The Unified Propagation and Scaling Algorithm
Teh, Yee W., Welling, Max
In this paper we will show that a restricted class of constrained minimum divergence problems, named generalized inference problems, can be solved by approximating the KL divergence with a Bethe free energy. The algorithm we derive is closely related to both loopy belief propagation and iterative scaling. This unified propagation and scaling algorithm reduces to a convergent alternative to loopy belief propagation when no constraints are present. Experiments show the viability of our algorithm.
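The iterative-scaling side of the algorithm can be illustrated by classical iterative proportional fitting, which rescales a joint table until its marginals match imposed constraints. This is a minimal sketch of that scaling step on a 2x2 table, not the paper's combined propagation-and-scaling procedure; the target marginals are made up for illustration.

```python
import numpy as np

# Iterative proportional fitting (iterative scaling): repeatedly rescale
# a joint distribution Q so its marginals match the given constraints.
Q = np.ones((2, 2)) / 4.0          # initial joint distribution (uniform)
row_target = np.array([0.7, 0.3])  # required marginal over the first variable
col_target = np.array([0.4, 0.6])  # required marginal over the second

for _ in range(50):
    Q *= (row_target / Q.sum(axis=1))[:, None]  # enforce row marginal
    Q *= (col_target / Q.sum(axis=0))[None, :]  # enforce column marginal

print(Q)
```

Each scaling step is a KL projection onto one constraint set; the paper's contribution is interleaving such scaling updates with Bethe-approximate propagation so that both constraints and (approximate) inference are handled in one algorithm.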