Suppose that multiple experts (or learning algorithms) provide us with alternative Bayesian network (BN) structures over a domain, and that we wish to combine them into a single consensus BN structure. Specifically, we require that the consensus BN structure represent only those independences on which all the given BN structures agree, and that it have as few associated parameters as possible. In this paper, we prove that several non-equivalent consensus BN structures may exist and that finding one of them is NP-hard. We therefore resort to heuristics to find an approximate consensus BN structure. Specifically, we consider the heuristic proposed by Matzkevich and Abramson, which builds upon two algorithms, called Methods A and B, for efficiently deriving the minimal directed independence map of a BN structure relative to a given node ordering. Methods A and B are claimed to be correct, but no full proof is provided (a proof is only sketched). We show that Methods A and B are not correct and propose a correction of them.

Fahimi, Hamed (Université Laval) | Quimper, Claude-Guy (Université Laval)

We present three new filtering algorithms for the Disjunctive constraint, all of which run in time linear in the number of tasks. The first algorithm filters the tasks according to the time-tabling rules. The second performs an overload check that could also be used for the Cumulative constraint. The third enforces the rules of detectable precedences. The last two algorithms use a new data structure that we introduce, called the time line. This data structure supports in constant time many operations that were previously implemented in logarithmic time with the Theta-tree data structure. Experiments show that these new algorithms are competitive even for a small number of tasks and outperform existing algorithms as the number of tasks increases.
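As a rough illustration of the overload rule only (not the paper's linear-time time-line algorithm), a simple quadratic-time check suffices for small instances. The sketch below assumes each task is given as an (earliest start time, processing time, latest completion time) triple; the function name and representation are illustrative, not taken from the paper.

```python
def overload_check(tasks):
    """Detect an overload for a set of tasks sharing one machine.

    tasks: list of (est, p, lct) tuples, where est is the earliest start
    time, p the processing time, and lct the latest completion time.
    Classical overload rule for the Disjunctive constraint: if some set
    Omega of tasks satisfies est(Omega) + p(Omega) > lct(Omega), then no
    feasible non-overlapping schedule exists. It is enough to test the
    task sets induced by every time window [a, b] with a an est value
    and b an lct value, giving this simple O(n^2) sketch (the paper's
    time-line algorithm achieves far better complexity).
    """
    ests = sorted({est for est, _, _ in tasks})
    lcts = sorted({lct for _, _, lct in tasks})
    for a in ests:
        for b in lcts:
            # total work that must be executed entirely inside [a, b]
            load = sum(p for est, p, lct in tasks
                       if est >= a and lct <= b)
            if load and a + load > b:
                return True
    return False
```

For example, a single task with est 0, processing time 5, and lct 4 is detected as an overload, whereas two unit windows that fit back to back are not.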

Shehu, Amarda (George Mason University) | Cortés, Juan (LAAS-CNRS) | Cheng, Jianlin (University of Missouri)

The AAAI-13 workshop Artificial Intelligence and Robotics Methods in Computational Biology provided a forum for AI and Robotics researchers with diverse backgrounds in search, planning, machine learning, data mining, evolutionary computation, and so on, to exchange views, treatments, and findings on important open problems relating to biomolecular structure prediction and design, motion simulation, and assembly/docking prediction. A total of eight papers were accepted.

Bachrach, Yoram (Microsoft Research Cambridge) | Kohli, Pushmeet (Microsoft Research Cambridge) | Kolmogorov, Vladimir (Institute of Science and Technology) | Zadimoghaddam, Morteza (Massachusetts Institute of Technology)

Representation languages for coalitional games are a key research area in algorithmic game theory. There is an inherent tradeoff between how general a language is, allowing it to capture more elaborate games, and how hard it is computationally to optimize and solve such games. One prominent such language is the simple yet expressive Weighted Graph Games (WGGs) representation (Deng and Papadimitriou, 1994), which maintains knowledge about synergies between agents in the form of an edge-weighted graph. We consider the problem of finding the optimal coalition structure in WGGs. The agents in such games are vertices in a graph, and the value of a coalition is the sum of the weights of the edges present between coalition members. The optimal coalition structure is a partition of the agents into coalitions that maximizes the sum of the utilities obtained by the coalitions. We show that finding the optimal coalition structure is not only hard for general graphs, but is also intractable for restricted families such as planar graphs, which are amenable to many other combinatorial problems. We then provide algorithms with constant-factor approximations for planar, minor-free, and bounded-degree graphs.
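As a toy illustration of the model (exhaustive enumeration, not the paper's approximation algorithms), the definitions above can be made concrete for tiny instances. The sketch below assumes edge weights are given as a dictionary keyed by two-agent frozensets; all function names are illustrative.

```python
from itertools import combinations

def coalition_value(coalition, weights):
    # WGG value of a coalition: sum of edge weights between its members
    return sum(weights.get(frozenset(e), 0)
               for e in combinations(coalition, 2))

def partitions(items):
    # enumerate all set partitions of `items` (Bell-number many, tiny inputs only)
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for p in partitions(rest):
        yield [[first]] + p                      # `first` in its own block
        for i in range(len(p)):                  # or joined to an existing block
            yield p[:i] + [[first] + p[i]] + p[i + 1:]

def optimal_coalition_structure(agents, weights):
    # brute-force search for the partition maximizing the sum of coalition values
    return max(partitions(list(agents)),
               key=lambda p: sum(coalition_value(b, weights) for b in p))
```

On a triangle with weights w(1,2) = 3, w(2,3) = -5, and w(1,3) = 2, the grand coalition has value 0, so the optimal structure separates agent 3 and keeps agents 1 and 2 together, for a total value of 3.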

In this presentation, Mate plans to talk about the satisfiability problem, the basis of all NP-complete problems; how these problems are solved with state-of-the-art SAT solvers; and how that is relevant to our everyday lives. SAT solvers have enjoyed a boom thanks to massive improvements through the use of lazy data structures, clever graph algorithms, proof verification, and heuristics. This has allowed SAT solvers to thrive in many different tasks such as hardware verification, software fuzzing, cryptography, and mathematical proofs. In this talk we will examine some of the core data structures and algorithms used by SAT solvers, as well as show some example use cases for them in different fields.