
Collaborating Authors

 University of Freiburg


Multiobjective Tree-Structured Parzen Estimator

Journal of Artificial Intelligence Research

Practitioners often encounter challenging real-world problems that involve the simultaneous optimization of multiple objectives over a complex search space. To address these problems, we propose a practical multiobjective Bayesian optimization algorithm: an extension of the widely used Tree-structured Parzen Estimator (TPE), called the Multiobjective Tree-structured Parzen Estimator (MOTPE). Our numerical results demonstrate that MOTPE approximates the Pareto fronts of a variety of benchmark problems and of a convolutional neural network design problem better than existing methods. We also empirically investigate how MOTPE's configuration affects its behavior and performance, as well as the effectiveness of its asynchronous parallelization.
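To make the setting concrete, here is a minimal usage sketch of multiobjective TPE on a two-objective toy problem, assuming an Optuna version that still provides optuna.samplers.MOTPESampler (it was later folded into TPESampler); the objective function and its bounds are made up for illustration.

```python
import optuna

# Two-objective toy problem (both objectives minimized); the function is
# invented for illustration only.
def objective(trial):
    x = trial.suggest_float("x", 0.0, 5.0)
    y = trial.suggest_float("y", 0.0, 3.0)
    return 4 * x**2 + 4 * y**2, (x - 5) ** 2 + (y - 5) ** 2

sampler = optuna.samplers.MOTPESampler(seed=0)
study = optuna.create_study(directions=["minimize", "minimize"],
                            sampler=sampler)
study.optimize(objective, n_trials=100)

# study.best_trials holds the non-dominated (Pareto-optimal) trials.
for t in study.best_trials:
    print(t.values, t.params)
```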


Trial-Based Heuristic Tree Search for MDPs with Factored Action Spaces

AAAI Conferences

MDPs with factored action spaces, i.e., where actions are described as assignments to a set of action variables, allow reasoning over action variables instead of ground actions, yet most algorithms consider only a grounded action representation. This includes algorithms that are instantiations of the Trial-based Heuristic Tree Search (THTS) framework, such as AO* or UCT. To enable reasoning over factored action spaces, we propose a generalization of THTS in which nodes that branch over all applicable actions are replaced with subtrees of nodes that each represent the decision for a single action variable. We show that many THTS algorithms retain their theoretical properties under the generalized framework, and show how to approximate any state-action heuristic with a heuristic for partial action assignments. This allows us to guide a UCT variant that can create exponentially fewer nodes than the same algorithm over ground actions. An empirical evaluation on the benchmark set of the probabilistic track of the latest International Planning Competition validates the benefits of the approach.
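A minimal sketch of the branching idea, under assumed binary action variables: a grounded decision node has one child per joint action (exponentially many), while the factored subtree decides one variable at a time, with at most two children per node.

```python
from itertools import product

ACTION_VARS = ["v0", "v1", "v2"]  # three binary action variables (assumed)

def grounded_children(state):
    # One child per joint action: 2^n children at a single decision node.
    return [dict(zip(ACTION_VARS, bits))
            for bits in product([0, 1], repeat=len(ACTION_VARS))]

def factored_children(partial):
    # One action variable decided per node: at most 2 children, n levels deep.
    unset = [v for v in ACTION_VARS if v not in partial]
    if not unset:
        return []  # complete assignment, i.e., a ground action
    v = unset[0]
    return [{**partial, v: 0}, {**partial, v: 1}]
```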


Symbolic Planning with Edge-Valued Multi-Valued Decision Diagrams

AAAI Conferences

Symbolic representations have attracted significant attention in optimal planning. Binary Decision Diagrams (BDDs) form the basis for symbolic search algorithms. Closely related are Algebraic Decision Diagrams (ADDs), which are used to represent heuristic functions. Progress has also been made in handling models with state-dependent action costs. Here, costs are represented as Edge-valued Multi-valued Decision Diagrams (EVMDDs), which can be exponentially more compact than the corresponding ADD representation. However, EVMDDs had not yet been considered for symbolic planning. In this work, we study EVMDD-based symbolic search for optimal planning. We define EVMDD-based representations of state sets and transition relations, and show how to compute the operations required for EVMDD-A*. This EVMDD-based version of symbolic A* generalizes its BDD variant and allows solving planning tasks with state-dependent action costs. We prove theoretically that our approach is sound, complete, and optimal. Additionally, we present an empirical analysis comparing EVMDD-A* to BDD-A* and explicit A* search. Our results underscore the usefulness of symbolic approaches and the feasibility of dealing with models that go beyond unit costs.
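As background on the data structure, here is a minimal sketch of EVMDD evaluation, under the standard convention that a function value is a constant offset plus the sum of edge values along the path selected by the state; the node layout and example function are assumptions.

```python
# Hypothetical sketch: evaluating an EVMDD-encoded cost function.
class EVMDDNode:
    def __init__(self, var=None, children=None):
        self.var = var                  # variable tested at this node
        self.children = children or {}  # value -> (edge_weight, child)

TERMINAL = EVMDDNode()  # the 0-sink

def evaluate(offset, node, state):
    cost = offset
    while node is not TERMINAL:
        weight, node = node.children[state[node.var]]
        cost += weight
    return cost

# Example: cost(x, y) = x + 2*y over x, y in {0, 1}.
y_node = EVMDDNode("y", {0: (0, TERMINAL), 1: (2, TERMINAL)})
x_node = EVMDDNode("x", {0: (0, y_node), 1: (1, y_node)})
assert evaluate(0, x_node, {"x": 1, "y": 1}) == 3
```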


Warmstarting of Model-Based Algorithm Configuration

AAAI Conferences

The performance of many hard combinatorial problem solvers depends strongly on their parameter settings, and since manual parameter tuning is both tedious and suboptimal, the AI community has recently developed several algorithm configuration (AC) methods to address this problem automatically. While all existing AC methods start the configuration process of an algorithm A from scratch for each new type of benchmark instance, here we propose to exploit information about A's performance on previous benchmarks in order to warmstart its configuration on new types of benchmarks. We introduce two complementary ways of exploiting this information to warmstart AC methods based on a predictive model. Experiments on optimizing a flexible modern SAT solver on twelve different instance sets show that our methods often yield substantial speedups over existing AC methods (up to 165-fold) and can also find substantially better configurations given the same compute budget.
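One way to picture model-based warmstarting, sketched below under stated assumptions: fit a performance model on configuration data from previous benchmarks and start the new configuration run from the candidates the model predicts to be best. The data layout and model choice here are illustrative, not the paper's exact method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Stand-ins for real data from earlier configuration runs:
# rows are numerically encoded configurations, y holds observed runtimes.
rng = np.random.default_rng(0)
X_prev = rng.random((200, 5))
y_prev = rng.random(200)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_prev, y_prev)

# Warmstart: rank fresh candidate configurations by predicted runtime and
# seed the configurator with the most promising ones.
candidates = rng.random((50, 5))
order = np.argsort(model.predict(candidates))
warmstart_configs = candidates[order[:5]]
```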


On the Relationship Between State-Dependent Action Costs and Conditional Effects in Planning

AAAI Conferences

When planning for tasks that feature both state-dependent action costs and conditional effects using relaxation heuristics, the following problem appears: handling costs and effects separately leads to worse-than-necessary heuristic values, since we may get the more useful effect at the lower cost by choosing different values of a relaxed variable when determining relaxed costs and relaxed active effects. In this paper, we show how this issue can be avoided by representing state-dependent costs and conditional effects uniformly, both as edge-valued multi-valued decision diagrams (EVMDDs) over different sets of edge values, and then working with their product diagram. We develop a theory of EVMDDs that is general enough to encompass state-dependent action costs, conditional effects, and even their combination. We define relaxed effect semantics in the presence of state-dependent action costs and conditional effects, and describe how this semantics can be computed efficiently using product EVMDDs. This forms the foundation for informative relaxation heuristics in the combined setting of state-dependent costs and conditional effects.
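A minimal sketch of the product-diagram intuition, with assumed node layout and example: each edge carries a paired value (cost, triggered effects), so a single path choice fixes both the relaxed cost and the relaxed active effects, rather than letting the two be optimized independently.

```python
# Hypothetical sketch: one path through the product diagram determines
# cost and effects together.
class Node:
    def __init__(self, var=None, children=None):
        self.var = var
        self.children = children or {}  # value -> ((cost, effects), child)

SINK = Node()

def evaluate(node, state):
    cost, effects = 0, frozenset()
    while node is not SINK:
        (edge_cost, edge_effects), node = node.children[state[node.var]]
        cost += edge_cost
        effects = effects | edge_effects
    return cost, effects

# Example: if x = 1 the action costs 2 and additionally sets p; else it costs 1.
x_node = Node("x", {0: ((1, frozenset()), SINK),
                    1: ((2, frozenset({"p"})), SINK)})
assert evaluate(x_node, {"x": 1}) == (2, frozenset({"p"}))
```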


Rational Inference Patterns Based on Conditional Logic

AAAI Conferences

Conditional information is an integral part of representing and reasoning about causal relationships, temporal events, and even cognitive agents' deliberation about impossible scenarios. Formalizing these inferences requires a proper formal representation. Psychological studies indicate that classical, monotonic logic is not the appropriate model for capturing human reasoning: in some cases participants systematically deviate from classically valid answers, while in others they even endorse logically invalid ones. Many previous analyses considered individual inference rules applied by human reasoners in isolation. In this paper, we define inference patterns as a formalization of the joint usage or avoidance of these rules. Considering patterns instead of single inferences opens the way to categorizing inference studies by their qualitative results. We apply plausibility relations, which provide basic formal models for many theories of conditionals, nonmonotonic reasoning, and belief revision, to assess the rationality of the patterns and thus of the individual inferences drawn in a study. By replacing classical logic with formalisms better suited to conditionals, we shift the basis for judging rationality from compatibility with classical entailment to consistency in a logic of conditionals. Using inductive reasoning on the plausibility relations, we reverse-engineer conditional knowledge bases as an explanatory model for, and formalization of, the participants' background knowledge. In this way, the conditional knowledge bases derived from the inference patterns provide an explanation for the outcome of the study that generated the pattern.
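For intuition, here is a minimal worked example of the standard plausibility-based acceptance condition for conditionals: (B|A) is accepted iff the most plausible A-worlds satisfy B. The atoms and ranking values below are assumptions chosen for illustration.

```python
# Worlds are sets of true atoms over {bird, flies}; lower rank = more plausible.
rank = {
    frozenset({"bird", "flies"}): 0,
    frozenset({"bird"}): 1,        # a non-flying bird is less plausible
    frozenset({"flies"}): 0,
    frozenset(): 0,
}

def accepts(antecedent, consequent):
    # (consequent | antecedent) holds iff every most-plausible
    # antecedent-world also satisfies the consequent.
    a_worlds = [w for w in rank if antecedent in w]
    best = min(rank[w] for w in a_worlds)
    return all(consequent in w for w in a_worlds if rank[w] == best)

assert accepts("bird", "flies")  # "birds normally fly" is accepted
```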


Efficient Parameter Importance Analysis via Ablation with Surrogates

AAAI Conferences

To achieve peak performance, it is often necessary to adjust the parameters of a given algorithm to the class of problem instances to be solved; this is known to be the case for popular solvers for a broad range of AI problems, including AI planning, propositional satisfiability (SAT), and answer set programming (ASP). To avoid tedious and often highly suboptimal manual tuning of such parameters by ad-hoc methods, general-purpose algorithm configuration procedures can be used to automatically find performance-optimizing parameter settings. While impressive performance gains are often achieved in this manner, an additional, potentially costly parameter importance analysis is required to gain insight into which parameter changes are most responsible for those improvements. Here, we show how the running time cost of ablation analysis, a well-known general-purpose approach for assessing parameter importance, can be reduced substantially by using regression models of algorithm performance constructed from data collected during the configuration process. In our experiments, we demonstrate speedup factors between 33 and 14,727 for ablation analysis on various configuration scenarios from AI planning, SAT, ASP, and mixed integer programming (MIP).
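A minimal sketch of the surrogate-based ablation idea, with an assumed predict callback standing in for the learned performance model: walk greedily from the default to the optimized configuration, at each round flipping the single parameter whose change the model predicts to help most, so no target-algorithm runs are needed along the path.

```python
def ablation_path(default, optimized, predict):
    """Order parameter changes by predicted importance (lower = better)."""
    current = dict(default)
    path = []
    differing = [p for p in default if default[p] != optimized[p]]
    while differing:
        # Evaluate each remaining single-parameter flip on the surrogate.
        best = min(differing,
                   key=lambda p: predict({**current, p: optimized[p]}))
        current[best] = optimized[best]
        path.append((best, predict(current)))
        differing.remove(best)
    return path
```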


Sleep Sets Meet Duplicate Elimination

AAAI Conferences

The sleep sets technique is a path-dependent pruning method for state space search. In the past, combining sleep sets with graph search algorithms that perform duplicate elimination has often proven error-prone. In this paper, we provide the theoretical basis for integrating sleep sets with common AI search algorithms that perform duplicate elimination. Specifically, we investigate approaches to safely integrate sleep sets with optimal (best-first) search algorithms. Based on this theory, we take an initial step towards integrating sleep sets within A* alongside additional state pruning techniques like strong stubborn sets. Our experiments show slight yet consistent improvements in the number of generated search nodes across a large number of standard domains from the International Planning Competitions.
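As background, here is a minimal sketch of the basic path-dependent sleep-set pruning in a depth-first search, without the duplicate elimination the paper addresses; the commutes, applicable, and successor callbacks are assumptions.

```python
def search(state, sleep, commutes, applicable, successor, visit):
    visit(state)
    done = []
    for op in applicable(state):
        if op in sleep:
            continue  # pruned: an equivalent permutation was already explored
        # The child sleeps on operators from the current sleep set, plus
        # operators already branched on here, that commute with op.
        child_sleep = {o for o in sleep if commutes(o, op)}
        child_sleep |= {o for o in done if commutes(o, op)}
        search(successor(state, op), child_sleep, commutes,
               applicable, successor, visit)
        done.append(op)
```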


Clauses Versus Gates in CEGAR-Based 2QBF Solving

AAAI Conferences

2QBF is a special case of general quantified Boolean formulae (QBF), limited to just two quantification levels, i.e., to the form forall-exists. Despite this limitation, it applies to a wide range of applications, e.g., in artificial intelligence, graph theory, and synthesis. Recent research showed that CEGAR-based methods give a performance boost to QBF solving (e.g., compared to QDPLL). Conjunctive normal form (CNF) is a commonly accepted representation for both SAT and QBF problems; however, it does not reflect the circuit structure that might be present in the problem. Existing attempts at extracting this structure from CNF and using it in the 2QBF context have not shown advantages over CNF-based 2QBF solvers. In this work, we introduce a new workflow for 2QBF, consisting of a new semantic circuit extraction algorithm and a CEGAR-based 2QBF solver that uses circuit structure and is improved by a so-called "cofactor sharing" heuristic. We evaluate the proposed methodology on a range of benchmarks and show the practicality of the new approach.
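As background, here is a minimal sketch of the standard two-oracle CEGAR loop for an exists-forall 2QBF (a forall-exists formula can be decided dually, by refuting its negation); the propose and check callbacks stand in for SAT-solver calls and are assumptions, not this paper's specific solver.

```python
def cegar_2qbf(propose, check):
    """Decide exists-X forall-Y phi with two cooperating SAT oracles."""
    refinements = []                        # counterexample Y-assignments
    while True:
        candidate = propose(refinements)    # X consistent with refinements
        if candidate is None:
            return False, None              # abstraction unsatisfiable
        counterexample = check(candidate)   # Y falsifying phi under X
        if counterexample is None:
            return True, candidate          # candidate works for every Y
        refinements.append(counterexample)  # refine and iterate
```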


Reports on the 2015 AAAI Workshop Program

AI Magazine

AAAI's 2015 Workshop Program was held Sunday and Monday, January 25–26, 2015, at the Hyatt Regency Austin Hotel in Austin, Texas, USA. The AAAI-15 workshop program included 15 workshops covering a wide range of topics in artificial intelligence. Most workshops were held on a single day. The workshop titles were: AI and Ethics; AI for Cities; AI for Transportation: Advice, Interactivity and Actor Modeling; Algorithm Configuration; Artificial Intelligence Applied to Assistive Technologies and Smart Environments; Beyond the Turing Test; Computational Sustainability; Computer Poker and Imperfect Information; Incentive and Trust in E-Communities; Multiagent Interaction without Prior Coordination; Planning, Search, and Optimization; Scholarly Big Data: AI Perspectives, Challenges, and Ideas; Trajectory-Based Behaviour Analytics; World Wide Web and Public Health Intelligence; Knowledge, Skill, and Behavior Transfer in Autonomous Robots; and Learning for General Competency in Video Games.