Description Logics Courses and Tutorials

AITopics Original Links

Enrico Franconi's Course on Description Logics. The material includes slides for six modules (320 slides in total): A Review of Computational Logics, Structural Description Logics, Propositional Description Logics, Description Logics and Knowledge Bases, Description Logics and Logics, and Description Logics and Databases. A web pointer to an online modified version of CRACK, which allows tracing satisfiability proofs with tableaux, is provided, along with pointers to relevant online literature.


Lightweight Description Logics and Branching Time: A Troublesome Marriage

AAAI Conferences

We study branching-time temporal description logics (BTDLs) based on the temporal logic CTL in the presence of rigid (time-invariant) roles and general TBoxes. There is evidence that if full CTL is combined with the classical description logic ALC in this way, reasoning becomes undecidable. In this paper, we begin by substantiating this claim, establishing undecidability already for a fragment of this combination. In view of this negative result, we then investigate BTDLs that emerge from combining fragments of CTL with lightweight DLs from the EL and DL-Lite families. We show that even rather inexpressive BTDLs based on EL exhibit very high complexity. Most notably, we identify two convex fragments that are, respectively, undecidable and hard for non-elementary time. For BTDLs based on DL-Lite-bool-N, we obtain tight complexity bounds that range from PSPACE to EXPTIME.
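For orientation, concepts in such combinations interleave CTL operators with DL constructors. The following TBox axiom is an illustrative sketch in commonly used temporal-DL notation (the example and all names are ours, not the paper's):

\[
\mathsf{PhDStudent} \;\sqsubseteq\; \exists\,\mathsf{supervisedBy}.\mathsf{Professor} \;\sqcap\; \mathbf{EF}\,\mathsf{Graduated}
\]

Here EF C holds at a temporal state if C holds at some state on some branch starting there, and declaring supervisedBy rigid forces it to be interpreted identically at every state. It is exactly this interaction between rigid roles and branching temporal operators that drives the high complexity.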


Description Logics and Planning

AI Magazine

This article surveys previous work on combining planning techniques with expressive representations of knowledge in description logics to reason about tasks, plans, and goals. Description logics can reason about the logical definition of a class, automatically infer class-subclass subsumption relations, and classify instances into classes based on their definitions. Descriptions of actions, plans, and goals can be exploited during plan generation, plan recognition, or plan evaluation. Another emerging use of these techniques is the Semantic Web, where current ontology languages based on description logics need to be extended to reason about goals and capabilities for web services and agents.
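To make the subsumption inference concrete, here is a minimal, self-contained Python sketch (ours, not from the article) of structural subsumption for an EL-style concept language of atomic-concept conjunctions and existential role restrictions; all class and role names are illustrative.

from dataclasses import dataclass

@dataclass(frozen=True)
class Concept:
    """An EL-style description: a conjunction of atomic concepts
    plus existential role restrictions (role, filler)."""
    atoms: frozenset
    exists: frozenset = frozenset()

def subsumes(d: Concept, c: Concept) -> bool:
    """True iff C is subsumed by D (every instance of C is a D):
    each conjunct of D must be matched by some conjunct of C."""
    if not d.atoms <= c.atoms:
        return False
    return all(
        any(r == role and subsumes(filler, f) for r, f in c.exists)
        for role, filler in d.exists
    )

# Parent = Person AND (EXISTS hasChild.Person)
# Father = Person AND Male AND (EXISTS hasChild.Person)
person = Concept(frozenset({"Person"}))
parent = Concept(frozenset({"Person"}), frozenset({("hasChild", person)}))
father = Concept(frozenset({"Person", "Male"}), frozenset({("hasChild", person)}))

print(subsumes(parent, father))  # True: every Father is a Parent
print(subsumes(father, parent))  # False: a Parent need not be Male

A classifier runs this kind of check pairwise over all defined classes to build the subsumption hierarchy automatically; full reasoners replace the structural test with tableau or consequence-based procedures for more expressive languages.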


Riguzzi

AAAI Conferences

Modeling real-world domains increasingly requires representing uncertain information. The DISPONTE semantics for probabilistic description logics allows the axioms of a knowledge base to be annotated with a value that represents their probability. In this paper we discuss approaches for performing inference over probabilistic ontologies that follow the DISPONTE semantics. We present the algorithm BUNDLE for computing the probability of queries. BUNDLE exploits an underlying description logic reasoner, such as Pellet, to find explanations for a query. These are then encoded in a Binary Decision Diagram that is used to compute the probability of the query.
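As a concrete illustration of the last two steps, here is a short, self-contained Python sketch (ours, not the BUNDLE implementation; the axiom probabilities and explanations are invented). Under DISPONTE, each annotated axiom holds independently with its probability, each explanation is a set of axioms that jointly entail the query, and the query probability is the probability that at least one explanation is fully selected. The recursion below is the Shannon expansion that a Binary Decision Diagram evaluates, with memoization standing in for BDD node sharing.

from functools import lru_cache

prob = {"a1": 0.4, "a2": 0.7, "a3": 0.5}      # hypothetical axiom probabilities
explanations = [frozenset({"a1", "a2"}),      # hypothetical explanations found
                frozenset({"a2", "a3"})]      # by the underlying DL reasoner

def query_probability(explanations, prob):
    """P(at least one explanation has all of its axioms selected)."""
    order = sorted({a for e in explanations for a in e})  # fixed variable order

    @lru_cache(maxsize=None)
    def expand(exps, i):
        if not exps:                 # no explanation can still succeed
            return 0.0
        if frozenset() in exps:      # some explanation is fully satisfied
            return 1.0
        a = order[i]
        with_a = frozenset(e - {a} for e in exps)             # condition on a = true
        without_a = frozenset(e for e in exps if a not in e)  # condition on a = false
        return prob[a] * expand(with_a, i + 1) + (1 - prob[a]) * expand(without_a, i + 1)

    return expand(frozenset(explanations), 0)

print(query_probability(explanations, prob))  # 0.49 = P(a2) * P(a1 or a3)

Because explanations overlap, simply summing their probabilities would double-count; the expansion, like the BDD it mimics, accounts for the overlap correctly.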


Cold Posteriors and Aleatoric Uncertainty

arXiv.org Machine Learning

Recent work has observed that one can outperform exact inference in Bayesian neural networks by tuning the "temperature" of the posterior on a validation set (the "cold posterior" effect). To help interpret this phenomenon, we argue that commonly used priors in Bayesian neural networks can significantly overestimate the aleatoric uncertainty in the labels on many classification datasets. This problem is particularly pronounced in academic benchmarks like MNIST or CIFAR, for which the quality of the labels is high. For the special case of Gaussian process regression, any positive temperature corresponds to a valid posterior under a modified prior, and tuning this temperature is directly analogous to empirical Bayes. On classification tasks there is no direct equivalence between modifying the prior and tuning the temperature; however, reducing the temperature can lead to models that better reflect our belief that one gains little information by relabeling existing examples in the training set. Therefore, although cold posteriors do not always correspond to an exact inference procedure, we believe they may often better reflect our true prior beliefs.
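To spell out the Gaussian process case in symbols (a standard calculation; the notation is ours, not the paper's): tempering raises the unnormalized posterior to the power 1/T,

\[
p_T(f \mid y) \;\propto\; \bigl[\, p(y \mid f)\, p(f) \,\bigr]^{1/T},
\]

and for GP regression with likelihood \(\mathcal{N}(y; f, \sigma^2 I)\) and prior \(\mathcal{N}(f; 0, K)\), raising a Gaussian density to the power 1/T simply rescales its (co)variance by T:

\[
\mathcal{N}(y; f, \sigma^2 I)^{1/T} \propto \mathcal{N}(y; f, T\sigma^2 I),
\qquad
\mathcal{N}(f; 0, K)^{1/T} \propto \mathcal{N}(f; 0, T K).
\]

So the cold posterior at temperature T < 1 is the exact posterior under a modified prior with covariance TK and a reduced noise variance T\(\sigma^2\), which is why tuning T plays the same role as empirical-Bayes tuning of the prior.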