Many real-world problems, including inference in Bayes Nets, can be reduced to #SAT, the problem of counting the number of models of a propositional theory. This has motivated the need for efficient #SAT solvers. Currently, such solvers utilize a modified version of DPLL that employs decomposition and caching, techniques that significantly increase the time it takes to process each node in the search space. In addition, the search space is significantly larger than when solving SAT since we must continue searching even after the first solution has been found. It has previously been demonstrated that the size of a DPLL search tree can be significantly reduced by doing more reasoning at each node. However, for SAT the reductions gained are often not worth the extra time required. In this paper we verify the hypothesis that for #SAT this balance changes. In particular, we show that additional reasoning can reduce the size of a #SAT solver's search space, that this reduction cannot always be achieved by the already utilized technique of clause learning, and that this additional reasoning can be cost-effective.
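For readers unfamiliar with DPLL-based model counting, the following Python sketch illustrates the basic #SAT procedure this abstract builds on: a DPLL search that, instead of stopping at the first satisfying assignment, counts all of them, multiplying by two for each variable left unconstrained. The clause encoding and the `count_models` function are illustrative choices, not the solver studied in the paper, and the sketch omits the decomposition, caching, and additional per-node reasoning discussed above.

```python
# A minimal sketch of DPLL-style model counting (#SAT), assuming a CNF
# formula given as a list of clauses, each clause a list of non-zero
# integers (positive = variable, negative = its negation).  This toy
# version does unit propagation only; real #SAT solvers add component
# decomposition, caching, and clause learning.

def count_models(clauses, n_vars):
    def simplify(clauses, lit):
        """Assign `lit` true: drop satisfied clauses, shrink the rest."""
        out = []
        for c in clauses:
            if lit in c:
                continue                      # clause satisfied
            reduced = [l for l in c if l != -lit]
            if not reduced:
                return None                   # empty clause -> conflict
            out.append(reduced)
        return out

    def dpll(clauses, unassigned):
        # Unit propagation.
        while True:
            units = [c[0] for c in clauses if len(c) == 1]
            if not units:
                break
            lit = units[0]
            clauses = simplify(clauses, lit)
            if clauses is None:
                return 0
            unassigned = unassigned - {abs(lit)}
        if not clauses:
            # Every remaining variable is free: each doubles the count.
            return 2 ** len(unassigned)
        # Branch on some variable occurring in the formula.
        v = abs(clauses[0][0])
        rest = unassigned - {v}
        count = 0
        for lit in (v, -v):
            reduced = simplify(clauses, lit)
            if reduced is not None:
                count += dpll(reduced, rest)
        return count

    return dpll(clauses, set(range(1, n_vars + 1)))


# Example: (x1 or x2) and (not x1 or x3) over 3 variables has 4 models.
print(count_models([[1, 2], [-1, 3]], 3))
```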

Berkes, Pietro, Wood, Frank, Pillow, Jonathan W.

The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses. However, there is no simple model that combines the observations that (1) marginal distributions over single-neuron spike counts are often approximately Poisson; and (2) joint distributions over the responses of multiple neurons are often strongly dependent. Here, we show that both marginal and joint properties of neural responses can be captured using Poisson copula models. Copulas are joint distributions that allow random variables with arbitrary marginals to be combined while incorporating arbitrary dependencies between them. Different copulas capture different kinds of dependencies, allowing for a richer and more detailed description of dependencies than traditional summary statistics, such as correlation coefficients. We explore a variety of Poisson copula models for joint neural response distributions, and derive an efficient maximum likelihood procedure for estimating them. We apply these models to neuronal data collected in macaque motor cortex, and quantify the improvement in coding accuracy afforded by incorporating the dependency structure between pairs of neurons.
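The copula construction can be illustrated with a short sketch: a Gaussian copula produces dependent uniform variables, which are then pushed through inverse Poisson CDFs to yield spike counts with Poisson marginals but a dependent joint distribution. The choice of copula family and the values of `rate1`, `rate2`, and `rho` are illustrative assumptions, not values from the paper, and the maximum likelihood estimation procedure itself is not shown.

```python
# A minimal sketch of drawing dependent spike counts whose marginals are
# Poisson, using a Gaussian copula.  Rates, correlation, and sample size
# are made-up illustrative values.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

rate1, rate2 = 3.0, 5.0      # assumed Poisson firing rates (spikes per bin)
rho = 0.6                    # assumed latent (copula) correlation
n_samples = 10_000

# 1. Sample from a bivariate Gaussian with the chosen correlation.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_samples)

# 2. Map each coordinate through the standard normal CDF -> dependent uniforms.
u = stats.norm.cdf(z)

# 3. Apply the inverse Poisson CDF to each margin -> dependent Poisson counts.
counts1 = stats.poisson.ppf(u[:, 0], mu=rate1).astype(int)
counts2 = stats.poisson.ppf(u[:, 1], mu=rate2).astype(int)

# The marginals stay (approximately) Poisson with the requested rates,
# while the joint distribution carries the dependence induced by rho.
print("means:", counts1.mean(), counts2.mean())
print("count correlation:", np.corrcoef(counts1, counts2)[0, 1])
```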

Even with a compact representation of the dynamics of the entire world, we will rarely want or need to work with the whole model. Given different goals, different time constraints, or different current world states, we might want to take very different views of the world. In this section, we discuss the construction of different world views by specifying only a subset of the possible attributes in the complete world model. In some cases, these abstract views will capture all of the world dynamics relevant to the problem at hand. In other cases, they will serve as tractable approximations to more complex models.

This list is intended to introduce some of the tools of Bayesian statistics and machine learning that can be useful to computational research in cognitive science. The first section mentions several useful general references, and the others provide supplementary readings on specific topics. If you would like to suggest some additions to the list, contact Tom Griffiths.

Dechter, Rina, Mateescu, Robert

The paper introduces mixed networks, a new framework for expressing and reasoning with probabilistic and deterministic information. The framework combines belief networks with constraint networks, and we define its semantics and graphical representation. We also introduce the AND/OR search space for graphical models and develop a new linear-space search algorithm. This provides the basis for understanding the benefits of processing the constraint information separately, resulting in the pruning of the search space. When the constraint part is tractable or has a small number of solutions, using the mixed representation can be exponentially more effective than using pure belief networks, which model constraints as conditional probability tables.
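The benefit of keeping the constraint part separate can be illustrated with a toy sketch: a two-variable belief network whose deterministic information is held as an explicit constraint relation, so a depth-first search can prune any partial assignment that already violates a constraint instead of multiplying in 0/1 entries from a constraint-encoding CPT. The network, probabilities, and constraint below are made up for illustration; the AND/OR search algorithm of the paper is not reproduced here.

```python
# A minimal sketch of the mixed-network idea: a tiny belief network over two
# binary variables plus an explicit constraint (A != B).  The search prunes
# partial assignments that violate a constraint before extending them.
# All numbers below are made-up illustrative values.

# Belief-network part: P(A) and P(B | A) over binary variables.
p_a = {0: 0.4, 1: 0.6}
p_b_given_a = {(0, 0): 0.9, (0, 1): 0.1,
               (1, 0): 0.2, (1, 1): 0.8}

# Constraint part: a relation over the variables it mentions (here A != B).
constraints = [(("A", "B"), lambda a, b: a != b)]

def violates(assignment):
    """True if some fully-instantiated constraint is violated."""
    for scope, rel in constraints:
        if all(v in assignment for v in scope):
            if not rel(*(assignment[v] for v in scope)):
                return True
    return False

def prob_of_consistent(assignment=None):
    """Total probability mass of full assignments satisfying all constraints."""
    assignment = assignment or {}
    if violates(assignment):
        return 0.0                      # prune: no need to extend this branch
    if "A" not in assignment:
        return sum(p_a[a] * prob_of_consistent({**assignment, "A": a})
                   for a in (0, 1))
    if "B" not in assignment:
        a = assignment["A"]
        return sum(p_b_given_a[(a, b)] * prob_of_consistent({**assignment, "B": b})
                   for b in (0, 1))
    return 1.0

# P(A != B) = P(A=0)P(B=1|A=0) + P(A=1)P(B=0|A=1) = 0.4*0.1 + 0.6*0.2 = 0.16
print(prob_of_consistent())
```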