Supplementary Materials A: Constraint Explanation for Problem 2

Neural Information Processing Systems

Constraint 2b checks the loss of each sample; the derivation is shown as follows. Constraints 2d to 2i are mainly adopted from Bertsimas and Dunn [2019], Chapter 8.2. Here we briefly explain the meaning and derivation of these constraints. Constraints 2g and 2h handle the case where there is no split at a node; Constraint 2i enforces the hierarchical structure of the tree. Algorithm 1 details the branch-and-bound scheme for training the optimal decision tree.
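To make the branch-and-bound idea concrete, here is a minimal, generic best-first branch-and-bound sketch on a toy 0/1 knapsack problem. It is not the paper's Algorithm 1 (which searches over decision trees); it only illustrates the shared scheme: branch on a decision, compute an optimistic bound from a relaxation, and prune any subproblem whose bound cannot beat the incumbent.

```python
import heapq

def branch_and_bound_knapsack(values, weights, capacity):
    """Best-first branch and bound for 0/1 knapsack (illustrative sketch).

    Branches on include/exclude decisions per item; bounds each node with
    the fractional (LP) relaxation and prunes against the incumbent.
    """
    n = len(values)
    # Order items by value density so the fractional bound is easy to compute.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def upper_bound(idx, value, remaining):
        # Fractional relaxation: greedily fill remaining capacity, allowing
        # a fractional piece of the first item that does not fully fit.
        bound = value
        for j in order[idx:]:
            if weights[j] <= remaining:
                remaining -= weights[j]
                bound += values[j]
            else:
                bound += values[j] * remaining / weights[j]
                break
        return bound

    best = 0  # incumbent (best feasible value found so far)
    # Max-heap via negated bounds: always expand the most promising node.
    heap = [(-upper_bound(0, 0, capacity), 0, 0, capacity)]
    while heap:
        neg_ub, idx, value, remaining = heapq.heappop(heap)
        if -neg_ub <= best:
            continue  # prune: optimistic bound cannot beat the incumbent
        if idx == n:
            best = value  # leaf: all decisions made, update incumbent
            continue
        j = order[idx]
        # Branch 1: include item j, if it still fits.
        if weights[j] <= remaining:
            v, r = value + values[j], remaining - weights[j]
            heapq.heappush(heap, (-upper_bound(idx + 1, v, r), idx + 1, v, r))
        # Branch 2: exclude item j.
        heapq.heappush(heap, (-upper_bound(idx + 1, value, remaining),
                              idx + 1, value, remaining))
    return best
```

The same skeleton applies to optimal decision trees: the branching decisions become split choices at tree nodes, and the bound comes from a relaxation of the MIP constraints (2a–2i).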


Searching for the M Best Solutions in Graphical Models

Flerova, Natalia, Marinescu, Radu, Dechter, Rina

Journal of Artificial Intelligence Research

The paper focuses on finding the m best solutions to combinatorial optimization problems using best-first or depth-first branch-and-bound search. Specifically, we present a new algorithm, m-A*, which extends the well-known A* to the m-best task, and prove for the first time that all of its desirable properties, including soundness, completeness, and optimal efficiency, are maintained. Since best-first algorithms require extensive memory, we also extend the memory-efficient depth-first branch and bound to the m-best task. We adapt both algorithms to optimization tasks over graphical models (e.g., Weighted CSP and MPE in Bayesian networks), and provide a complexity analysis and an empirical evaluation. Our experiments confirm the theoretical prediction that the best-first approach is largely superior when memory is available, but depth-first branch and bound is more robust. We also show that our algorithms are competitive with related schemes recently developed for the m-best task.
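The core idea behind extending A* to the m-best task can be sketched on a simplified path-search analogue: instead of closing each node after its first expansion, best-first search allows every node to be expanded up to m times, so the goal is reached along the m cheapest paths. This is an illustrative sketch of that idea on an explicit graph, not the graphical-model algorithm analyzed in the paper; the graph encoding and the optional heuristic `h` are assumptions for the example.

```python
import heapq

def m_best_paths(graph, start, goal, m, h=None):
    """Return the m cheapest start->goal paths, best-first (m-A* flavor).

    graph: dict mapping node -> list of (neighbor, edge_cost).
    h: optional admissible heuristic; defaults to 0 (uniform-cost search).
    Each node may be expanded up to m times instead of just once.
    """
    h = h or (lambda n: 0)
    counts = {}  # how many times each node has been expanded
    # Heap entries: (f = g + h, g, node, path-so-far).
    heap = [(h(start), 0, start, (start,))]
    solutions = []
    while heap and len(solutions) < m:
        f, g, node, path = heapq.heappop(heap)
        if counts.get(node, 0) >= m:
            continue  # this node has already contributed m expansions
        counts[node] = counts.get(node, 0) + 1
        if node == goal:
            solutions.append((g, list(path)))  # goal reached: record a solution
            continue
        for nxt, cost in graph.get(node, []):
            if counts.get(nxt, 0) < m:
                heapq.heappush(heap, (g + cost + h(nxt), g + cost,
                                      nxt, path + (nxt,)))
    return solutions
```

With an admissible heuristic the successive goal expansions are found in nondecreasing cost order, which is the property the paper formalizes and proves for m-A* over graphical models.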