Mistake Bounds for Maximum Entropy Discrimination

Neural Information Processing Systems

The bound is the same as a general bound proved for the Weighted Majority Algorithm, and similar to bounds for other variants of Winnow. We prove a more refined bound that leads to a nearly optimal algorithm for learning disjunctions, again based on the maximum entropy principle.
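
For reference in the disjunction-learning setting the abstract describes, here is a minimal sketch of the classic Winnow update (a point of comparison named in the abstract, not the paper's maximum-entropy algorithm); the multiplier alpha and the threshold are the usual textbook choices:

```python
import numpy as np

def winnow(examples, n, alpha=2.0, threshold=None):
    """Classic Winnow for learning a monotone disjunction over n Boolean
    attributes. `examples` yields (x, y) with x a 0/1 array, y in {0, 1}.
    Returns the learned weights and the number of mistakes made online."""
    if threshold is None:
        threshold = n / 2.0
    w = np.ones(n)
    mistakes = 0
    for x, y in examples:
        y_hat = int(w @ x >= threshold)
        if y_hat != y:
            mistakes += 1
            if y == 1:            # false negative: promote active attributes
                w[x == 1] *= alpha
            else:                 # false positive: demote active attributes
                w[x == 1] /= alpha
    return w, mistakes

# Toy run: target disjunction x0 OR x3 over n = 20 attributes.
rng = np.random.default_rng(0)
n = 20
X = rng.integers(0, 2, size=(500, n))
y = X[:, 0] | X[:, 3]
w, m = winnow(zip(X, y), n)
print("mistakes:", m)  # O(k log n) for a k-literal disjunction
```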


Uncertain Reasoning Using Maximum Entropy Inference

arXiv.org Artificial Intelligence

The use of maximum entropy inference in reasoning with uncertain information is commonly justified by an information-theoretic argument. This paper discusses a possible objection to this information-theoretic justification and shows how it can be met. I then compare maximum entropy inference with certain other currently popular methods for uncertain reasoning. In making such a comparison, one must distinguish between static and dynamic theories of degrees of belief: a static theory concerns the consistency conditions for degrees of belief at a given time, whereas a dynamic theory concerns how one's degrees of belief should change in the light of new information. It is argued that maximum entropy is a dynamic theory and that a complete theory of uncertain reasoning can be obtained by combining maximum entropy inference with probability theory, which is a static theory. This total theory, I argue, is much better grounded than other theories of uncertain reasoning.
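
As a concrete numeric illustration of maximum entropy inference (Jaynes' classic die problem, not an example from the paper): among all distributions over faces 1-6 with a prescribed mean, the maximum entropy choice is exponential in the face value, with the tilt parameter fixed by the mean constraint. The bracket passed to the root finder below is an arbitrary choice:

```python
import numpy as np
from scipy.optimize import brentq

# Of all distributions over faces 1..6 with mean 4.5, maximum entropy
# selects p_i proportional to exp(lam * i), with lam solving the moment
# condition E[i] = 4.5.
faces = np.arange(1, 7)

def mean_given(lam):
    p = np.exp(lam * faces)
    p /= p.sum()
    return p @ faces

lam = brentq(lambda l: mean_given(l) - 4.5, -5.0, 5.0)
p = np.exp(lam * faces)
p /= p.sum()
print(np.round(p, 4), "mean =", round(p @ faces, 3))
```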


Maximum Uncertainty Procedures for Interval-Valued Probability Distributions

arXiv.org Artificial Intelligence

Measures of uncertainty and divergence are introduced for interval-valued probability distributions and are shown to have desirable mathematical properties. A maximum uncertainty inference procedure for marginal interval distributions is presented. A technique for reconstruction of interval distributions from projections is developed based on this inference procedure.
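
A minimal sketch of what a maximum uncertainty procedure can look like in the simplest case, assuming Shannon entropy as the uncertainty measure and one probability interval per outcome (the paper's measures and procedures are more general); the function name and the SLSQP solver are illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize

def max_entropy_in_intervals(lo, hi):
    """Maximize Shannon entropy over p subject to sum(p) = 1 and lo <= p <= hi."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    assert lo.sum() <= 1.0 <= hi.sum(), "interval distribution must be nonempty"
    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return float(np.sum(p * np.log(p)))
    # A feasible starting point inside the intervals.
    x0 = lo + (1.0 - lo.sum()) * (hi - lo) / (hi - lo).sum()
    res = minimize(neg_entropy, x0, method="SLSQP",
                   bounds=list(zip(lo, hi)),
                   constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
    return res.x

# Three outcomes with interval-valued probabilities.
p = max_entropy_in_intervals([0.1, 0.2, 0.4], [0.4, 0.5, 0.7])
print(np.round(p, 3))  # as close to uniform as the intervals allow
```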


Correcting sample selection bias in maximum entropy density estimation

Neural Information Processing Systems

We study the problem of maximum entropy density estimation in the presence of known sample selection bias. We propose three bias correction approaches. The first one takes advantage of unbiased sufficient statistics which can be obtained from biased samples. The second one estimates the biased distribution and then factors the bias out. The third one approximates the second by only using samples from the sampling distribution. We provide guarantees for the first two approaches and evaluate the performance of all three approaches in synthetic experiments and on real data from species habitat modeling, where maxent has been successfully applied and where sample selection bias is a significant problem.
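
For orientation, here is a minimal sketch of plain (unbiased) maxent density estimation on a discrete domain, the baseline that the three correction approaches build on; the toy features, learning rate, and iteration count are arbitrary choices, and a known sampling density could be factored in as a base measure inside the same loop:

```python
import numpy as np

def fit_maxent(F, emp_mean, lr=1.0, iters=5000):
    """Fit a Gibbs density p(x) proportional to exp(theta . f(x)) whose
    feature expectations match the empirical means of the sample.
    F: (num_points, num_features) feature matrix over the whole domain."""
    theta = np.zeros(F.shape[1])
    for _ in range(iters):
        logits = F @ theta
        p = np.exp(logits - logits.max())
        p /= p.sum()
        theta += lr * (emp_mean - F.T @ p)  # gradient of the log-likelihood
    return theta, p

# Toy domain: 50 grid points on [0, 1]; features (x, x^2) give a
# Gaussian-shaped maximum entropy fit.
xs = np.linspace(0, 1, 50)
F = np.column_stack([xs, xs ** 2])
sample = np.clip(np.random.default_rng(1).normal(0.6, 0.1, 300), 0, 1)
emp = np.array([sample.mean(), (sample ** 2).mean()])
theta, p = fit_maxent(F, emp)
print("fitted mode:", round(xs[np.argmax(p)], 2))  # near the sample mean 0.6
```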


Inferring Hierarchical Clustering Structures by Deterministic Annealing

AAAI Conferences

The unsupervised detection of hierarchical structures is a major topic in unsupervised learning and one of the key questions in data analysis and representation. We propose a novel algorithm for the problem of learning decision trees for data clustering and related problems. In contrast to many other methods based on successive tree growing and pruning, we propose an objective function for tree evaluation and derive a non-greedy technique for tree growing. Applying the principles of maximum entropy and minimum cross entropy, a deterministic annealing algorithm is derived in a mean-field approximation. This technique allows us to canonically superimpose tree structures and to fit parameters to averaged or 'fuzzified' trees.
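
For intuition, here is a simplified relative of the proposed method: deterministic annealing for flat clustering, where the mean-field assignments are Gibbs distributions at temperature T and T is lowered on a geometric schedule so the solution anneals from one effective cluster to many (the paper applies the same principle to tree structures); the schedule constants below are arbitrary:

```python
import numpy as np

def da_cluster(X, k, T0=5.0, Tmin=0.01, cool=0.9, inner=30, seed=0):
    """Deterministic annealing clustering: alternate mean-field assignment
    and centroid updates while cooling the temperature."""
    rng = np.random.default_rng(seed)
    # Small perturbation breaks the symmetry so clusters can split as T drops.
    mu = X[rng.choice(len(X), k, replace=False)] \
         + 1e-3 * rng.standard_normal((k, X.shape[1]))
    T = T0
    while T > Tmin:
        for _ in range(inner):
            d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)
            r = np.exp(logits)
            r /= r.sum(axis=1, keepdims=True)        # soft assignments at T
            mu = (r.T @ X) / r.sum(axis=0)[:, None]  # centroid update
        T *= cool
    return mu, r

# Three well-separated 2-D clusters around (0,0), (2,2), (4,4).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.2, (100, 2)) for c in (0, 2, 4)])
mu, r = da_cluster(X, k=3)
print(np.round(mu, 2))
```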