Generalising the Interaction Rules in Probabilistic Logic

AAAI Conferences

The last two decades have seen the emergence of many different probabilistic logics that use logical languages to specify, and sometimes reason with, probability distributions. Probabilistic logics that support reasoning with probability distributions, such as ProbLog, use an implicit definition of an interaction rule to combine probabilistic evidence about atoms. In this paper, we show that this interaction rule is an example of a more general class of interactions that can be described by non-monotonic logics. We furthermore show that such local interactions about the probability of an atom can be described by convolution. The resulting extended probabilistic logic supports non-monotonic reasoning with probabilistic information.
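The convolution view of combining independent pieces of probabilistic evidence can be illustrated with a minimal sketch. The `convolve` helper and the two-coin example below are our own illustration, not ProbLog's actual interaction rule:

```python
def convolve(p, q):
    """Convolve two probability mass functions on {0, 1, ...}.

    If X ~ p and Y ~ q are independent evidence counts, the
    distribution of X + Y is the convolution of p and q.
    """
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

# two independent fair-coin pieces of evidence about an atom
combined = convolve([0.5, 0.5], [0.5, 0.5])
```

Here `combined` is `[0.25, 0.5, 0.25]`: the familiar binomial shape that results from aggregating two independent local sources of evidence.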

Generalizing Fuzzy Logic Probabilistic Inferences

Artificial Intelligence

Linear representations for a subclass of boolean symmetric functions selected by a parity condition are shown to constitute a generalization of the linear constraints on probabilities introduced by Boole. These linear constraints are necessary to compute probabilities of events whose relations to one another are arbitrarily specified by propositional-calculus boolean formulas.
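Boole's linear constraints can be made concrete with a small sketch: given only the marginals P(A) and P(B), the feasible values of P(A and B) form an interval cut out by linear inequalities over the four truth assignments. The `conjunction_bounds` helper and the grid-search approach are our own illustration of the idea:

```python
def conjunction_bounds(pa, pb, step=0.001):
    """Brute-force the Boole (Fréchet) bounds on P(A and B),
    given marginals P(A) = pa and P(B) = pb.

    For a candidate x = P(A and B), the probabilities of the four
    truth assignments are linear in x and must all be nonnegative.
    """
    feasible = []
    n = int(round(1 / step))
    for k in range(n + 1):
        x = k * step
        worlds = [x,                 # A and B
                  pa - x,            # A and not B
                  pb - x,            # not A and B
                  1 - pa - pb + x]   # not A and not B
        if all(p >= -1e-9 for p in worlds):
            feasible.append(x)
    return min(feasible), max(feasible)

lo, hi = conjunction_bounds(0.7, 0.6)
```

This recovers the closed-form bounds max(0, P(A) + P(B) - 1) <= P(A and B) <= min(P(A), P(B)), i.e. 0.3 and 0.6 for the marginals above; a linear-programming solver would handle arbitrary boolean formulas the same way.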


AAAI Conferences

Modal Markov Logic for a single agent has previously been proposed as an extension to propositional Markov logic. While that framework allowed reasoning under the principle of maximum entropy for various modal logics, its counting-based inference is not feasible for reasoning about the beliefs and knowledge of multiple agents, due to the magnitude of the numbers involved. We propose a modal extension of propositional Markov logic that avoids this problem by coarsening the state space. The problem stems from the fact that in the single-agent setting the state space is only doubly exponential in the number of propositions in the domain, whereas in the multi-agent setting it can become infinite. In addition, the proposed framework adds only the overhead of deciding satisfiability for the chosen modal logic on top of the complexity of exact inference in propositional Markov logic. The proposed framework allows one to find a distribution that matches the probabilities of formulas obtained from training data (or provided by an expert). Finally, we show how one can compute lower and upper bounds on the probabilities of arbitrary formulas.
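Exact inference in propositional Markov logic, the base case this abstract builds on, can be sketched by enumerating worlds and weighting each by the exponentiated sum of the weights of the formulas it satisfies. The function name and the example formulas below are our own illustration, not the paper's framework:

```python
import itertools
import math

def ml_prob(props, weighted_formulas, query):
    """Exact query probability in a tiny propositional Markov logic.

    props: list of proposition names.
    weighted_formulas: list of (formula, weight) pairs, where each
        formula is a predicate on a world (dict name -> bool).
    query: a predicate on a world.
    """
    weights = {}
    for vals in itertools.product([False, True], repeat=len(props)):
        world = dict(zip(props, vals))
        score = sum(wt for f, wt in weighted_formulas if f(world))
        weights[vals] = math.exp(score)  # unnormalized world weight
    z = sum(weights.values())            # partition function
    num = sum(w for vals, w in weights.items()
              if query(dict(zip(props, vals))))
    return num / z

# soft constraint "p or q" with weight 1.0; query P(p)
prob_p = ml_prob(["p", "q"],
                 [(lambda w: w["p"] or w["q"], 1.0)],
                 lambda w: w["p"])
```

The state-space blow-up the abstract describes is visible here: the loop over worlds is exponential in the number of propositions, and in the multi-agent modal setting there is no finite set of worlds to enumerate at all, which is what motivates coarsening the state space.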

On the Parameterized Complexity of Default Logic and Autoepistemic Logic

Artificial Intelligence

We investigate the application of Courcelle's Theorem and its logspace version due to Elberfeld et al. in the context of the implication problem for propositional sets of formulae, the extension existence problem for default logic, and the expansion existence problem for autoepistemic logic, and obtain fixed-parameter time- and space-efficient algorithms for these problems. On the other hand, we exhibit, for each of the above problems, families of instances of a very simple structure that, for a wide range of different parameterizations, do not have efficient fixed-parameter algorithms (even in the sense of the large class XPnu), unless P = NP.

Multi-valued logics

AAAI Conferences

There has been increasing interest in AI generally in inference methods which are extensions of the description provided by first-order logic. Circumscription [9], default logic [10] and probabilistic inference schemes such as that discussed in [7] are examples. Research in truth maintenance systems [4] has involved recording information concerning not only the truth or falsity of a given conclusion, but also justifications for that truth or falsity. This is useful in providing explanations, and also in the revision of inferences drawn using nonmonotonic inference rules. Assumption-based truth maintenance systems [3] provide an interesting extension of this idea, taking the truth value of a given proposition to be the set of contexts in which it will hold.
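The idea of taking a proposition's truth value to be the set of contexts in which it holds can be sketched as a label-propagation loop. The `propagate` function below is our own toy rendering of the assumption-based idea; it omits nogood handling and subsumption pruning, which a real ATMS would include:

```python
import itertools

def propagate(justifications, assumptions):
    """Toy ATMS-style labeling.

    justifications: list of (node, antecedents) pairs.
    assumptions: names of assumption nodes.

    Each node's label is a set of environments (frozensets of
    assumptions); the node holds in every context that includes
    all assumptions of some environment in its label.
    """
    labels = {a: {frozenset([a])} for a in assumptions}
    changed = True
    while changed:
        changed = False
        for node, antecedents in justifications:
            if all(a in labels for a in antecedents):
                # an environment for `node` is the union of one
                # environment from each antecedent's label
                new_envs = set()
                for combo in itertools.product(*(labels[a] for a in antecedents)):
                    new_envs.add(frozenset().union(*combo))
                old = labels.get(node, set())
                if not new_envs <= old:
                    labels[node] = old | new_envs
                    changed = True
    return labels

# r is justified by p and q; p, q are assumptions
labels = propagate([("r", ["p", "q"])], ["p", "q"])
```

Here `r` ends up labeled with the single environment `{p, q}`: the context in which both assumptions hold, which is exactly the "set of contexts" reading of its truth value described above.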