Sequentially Fitting ``Inclusive'' Trees for Inference in Noisy-OR Networks

Neural Information Processing Systems 

An important class of problems can be cast as inference in noisy-OR Bayesian networks, where the binary state of each variable is a logical OR of noisy versions of the states of the variable's parents. For example, in medical diagnosis, the presence of a symptom can be expressed as a noisy-OR of the diseases that may cause the symptom: on some occasions, a disease may fail to activate the symptom. Inference in richly-connected noisy-OR networks is intractable, but approximate methods (e.g., variational techniques) are showing increasing promise as practical solutions. One problem with most approximations is that they tend to concentrate on a relatively small number of modes in the true posterior, ignoring other plausible configurations of the hidden variables. We introduce a new sequential variational method for bipartite noisy-OR networks that favors including all modes of the true posterior and models the posterior distribution as a tree.
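The noisy-OR likelihood described above can be sketched as follows. This is a minimal, hypothetical illustration of the standard noisy-OR conditional probability, not code from the paper; the disease activation probabilities and the optional "leak" term are assumptions chosen for illustration.

```python
def noisy_or_prob(parent_states, activation_probs, leak=0.0):
    """P(child = 1 | parents) under a noisy-OR model.

    parent_states: binary values, 1 if the parent (disease) is present.
    activation_probs: p_i = P(parent i alone activates the child).
    leak: probability the child turns on with no active parents (assumed term).
    """
    # The child stays off only if the leak and every active parent
    # independently fail to activate it; each active parent fails
    # with probability (1 - p_i).
    fail = 1.0 - leak
    for state, p in zip(parent_states, activation_probs):
        if state:
            fail *= (1.0 - p)
    return 1.0 - fail

# A symptom caused by two diseases (illustrative probabilities):
print(round(noisy_or_prob([1, 0], [0.8, 0.6]), 4))  # 0.8: only disease 1 present
print(round(noisy_or_prob([1, 1], [0.8, 0.6]), 4))  # 0.92, i.e. 1 - 0.2 * 0.4
```

The example shows why a disease "may fail to activate the symptom": even with both diseases present, the symptom is absent with probability 0.2 × 0.4 = 0.08.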