Design, development and implementation of a tool for construction of declarative functional descriptions of semantic web services based on WSMO methodology
Semantic Web Services (SWS) are self-contained, self-describing, semantically marked-up software resources that can be published, discovered, composed and executed across the Web in a semi-automatic way. They are a key component of the future Semantic Web, in which networked computer programs become providers and users of information at the same time. This work focuses on developing a full-life-cycle software toolset for creating and maintaining Semantic Web Services based on the Web Service Modeling Ontology (WSMO) framework. A central part of a WSMO-based SWS is the service capability: a declarative description of the Web service's functionality. A formal syntax and semantics for such descriptions is provided by the Web Service Modeling Language (WSML), which is based on several logical formalisms, namely Description Logics, First-Order Logic and Logic Programming. A WSML description of a Web service capability is represented as a set of complex logical expressions (axioms). We develop a specialized, user-friendly tool for constructing and editing WSMO-based SWS capabilities. Since the users of this tool are not specialists in first-order logic, a graphical way of constructing and editing axioms is proposed. The designed process for constructing logical expressions is ontology-driven and abstracts away as much as possible from any concrete syntax of the logical language. We propose several mechanisms to guarantee the semantic consistency of the produced logical expressions. The tool is implemented in Java, using Eclipse as the IDE and the Graphical Editing Framework (GEF) for visualization.
Faith in the Algorithm, Part 2: Computational Eudaemonics
Rodriguez, Marko A., Watkins, Jennifer H.
Eudaemonics is the study of the nature, causes, and conditions of human well-being. According to the ethical theory of eudaemonia, reaping satisfaction and fulfillment from life is not only a desirable end, but a moral responsibility. However, in modern society, many individuals struggle to meet this responsibility. Computational mechanisms could better enable individuals to achieve eudaemonia by yielding practical real-world systems that embody algorithms promoting human flourishing. This article presents eudaemonic systems as the evolutionary goal of the present-day recommender system.
Safe Reasoning Over Ontologies
Grabarnik, Genady, Kershenbaum, Aaron
As ontologies proliferate and automatic reasoners become more powerful, the problem of protecting sensitive information becomes more serious. In particular, because facts can be inferred from other facts, it becomes increasingly likely that information included in an ontology, while not itself deemed sensitive, can be used to infer other, sensitive information. We first consider the problem of testing an ontology for safeness, defined as the impossibility of deriving any sensitive facts from it using a given collection of inference rules. We then consider the problem of optimizing an ontology based on the criterion of making as much useful information as possible available without revealing any sensitive facts.
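The safeness test described in this abstract can be illustrated with a minimal sketch. The Horn-rule representation and the forward-chaining closure below are our own illustrative assumptions, not the authors' formalism: an ontology is deemed safe if the deductive closure of its facts under the given rules contains no sensitive fact.

```python
def closure(facts, rules):
    """Deductive closure of a fact set under Horn rules.

    Each rule is (premises, conclusion): whenever all premises are
    present, the conclusion is added, until a fixed point is reached.
    """
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and premises <= derived:
                derived.add(conclusion)
                changed = True
    return derived

def is_safe(facts, rules, sensitive):
    """Safe iff no sensitive fact is derivable from the ontology."""
    return closure(facts, rules).isdisjoint(sensitive)

# Two individually harmless facts jointly reveal a sensitive one:
facts = {"works_in(bob, lab7)", "lab7_runs(project_x)"}
rules = [({"works_in(bob, lab7)", "lab7_runs(project_x)"},
          "involved_in(bob, project_x)")]
print(is_safe(facts, rules, {"involved_in(bob, project_x)"}))  # False
```

The optimization problem the abstract mentions then amounts to choosing a maximal subset of facts for which `is_safe` still holds.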
On Solving Boolean Multilevel Optimization Problems
Argelich, Josep, Lynce, Ines, Marques-Silva, Joao
Many combinatorial optimization problems entail a number of hierarchically dependent optimization problems. An often used solution is to associate a suitably large cost with each individual optimization problem, such that the solution of the resulting aggregated optimization problem solves the original set of hierarchically dependent optimization problems. This paper starts by studying the package upgradeability problem in software distributions. Straightforward solutions based on Maximum Satisfiability (MaxSAT) and pseudo-Boolean (PB) optimization are shown to be ineffective, and unlikely to scale for large problem instances. Afterwards, the package upgradeability problem is related to multilevel optimization. The paper then develops new algorithms for Boolean Multilevel Optimization (BMO) and highlights a large number of potential applications. The experimental results indicate that the proposed algorithms for BMO allow solving optimization problems that existing MaxSAT and PB solvers would otherwise be unable to solve.
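The weight-aggregation idea discussed in this abstract can be sketched concretely. The clause encoding and brute-force search below are illustrative assumptions (real BMO solvers work on top of MaxSAT/PB engines); the key point shown is the weight condition: each priority level's weight must exceed the total weight of all lower levels, so minimizing the single aggregate cost solves the hierarchy lexicographically.

```python
from itertools import product

def violated(clauses, assign):
    # A clause (tuple of DIMACS-style literals) is violated if no literal is true.
    return sum(1 for c in clauses
               if not any((lit > 0) == assign[abs(lit)] for lit in c))

def bmo_weights(levels):
    """Weights satisfying the BMO condition: each level's weight
    exceeds the total weight of all lower-priority levels."""
    weights, total = [], 0
    for clauses in reversed(levels):      # lowest priority first
        w = total + 1
        weights.append(w)
        total += w * len(clauses)
    return list(reversed(weights))

def solve_bmo(nvars, levels):
    """Brute-force: minimize the aggregate weighted cost; with BMO
    weights this coincides with lexicographic optimization."""
    ws = bmo_weights(levels)
    best = None
    for bits in product([False, True], repeat=nvars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        cost = sum(w * violated(cl, assign) for w, cl in zip(ws, levels))
        if best is None or cost < best[0]:
            best = (cost, assign)
    return best

# Level 1 (most important): x1; level 2: (not x1), x2.
# The optimum satisfies level 1 first, then does its best on level 2.
print(solve_bmo(2, [[(1,)], [(-1,), (2,)]]))  # (1, {1: True, 2: True})
```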
Learning for Dynamic Subsumption
Hamadi, Youssef, Jabbour, Said, Sais, Lakhdar
In this paper a new dynamic subsumption technique for Boolean CNF formulae is proposed. It exploits simple and sufficient conditions to detect, during conflict analysis, clauses from the original formula that can be reduced by subsumption. During the learnt-clause derivation, at each step of the resolution process, we simply check for backward subsumption between the current resolvent and the clauses of the original formula encoded in the implication graph. Our approach gives rise to a strong and dynamic simplification technique that exploits learning to eliminate literals from the original clauses. Experimental results show that integrating our dynamic subsumption approach within the state-of-the-art SAT solvers Minisat and Rsat achieves interesting improvements, particularly on crafted instances.
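The core subsumption check behind this technique can be sketched in a few lines. The list-of-integer-literals clause representation is our own assumption, and real solvers perform the check inside conflict analysis rather than over the whole formula; the sketch only shows what "reducing an original clause by backward subsumption" means.

```python
def subsumes(c, d):
    """Clause c subsumes clause d iff every literal of c occurs in d
    (so d is redundant given c and can be replaced by c)."""
    return set(c) <= set(d)

def strengthen(original, resolvent):
    """Replace any original clause subsumed by the current resolvent
    with the resolvent itself, eliminating literals."""
    return [list(resolvent) if subsumes(resolvent, cl) else cl
            for cl in original]

# The resolvent (x1 or not x3) subsumes (x1 or x2 or not x3),
# eliminating the literal x2 from the original clause:
print(strengthen([[1, 2, -3], [2, 3]], [1, -3]))  # [[1, -3], [2, 3]]
```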
A Stochastic View of Optimal Regret through Minimax Duality
Abernethy, Jacob, Agarwal, Alekh, Bartlett, Peter L., Rakhlin, Alexander
We study the regret of optimal strategies for online convex optimization games. Using von Neumann's minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional--the minimizer over the player's actions of expected loss--defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary.
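The stochastic characterization stated in this abstract can be written schematically as follows (the notation is our own shorthand reconstructed from the abstract, not the paper's exact statement): the optimal regret is a supremum over joint distributions P of the adversary's sequence of the gap between a sum of conditionally minimal expected losses and the minimal empirical loss.

```latex
R_n \;=\; \sup_{P}\; \mathbb{E}_{x_1,\dots,x_n \sim P}
\left[\, \sum_{t=1}^{n} \inf_{f \in \mathcal{F}}
   \mathbb{E}\big[\ell(f, x_t) \,\big|\, x_1,\dots,x_{t-1}\big]
 \;-\; \inf_{f \in \mathcal{F}} \sum_{t=1}^{n} \ell(f, x_t) \right]
```

Read this way, the bracketed quantity is exactly the Jensen gap of the concave functional mentioned in the abstract, evaluated along the distributions induced by the adversary's play.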
Heterogeneous knowledge representation using a finite automaton and first order logic: a case study in electromyography
Rialle, Vincent, Vila, Annick, Besnard, Yves
In a certain number of situations, human cognitive functioning is difficult to represent with classical artificial intelligence structures. Such a difficulty arises in polyneuropathy diagnosis, which is based on the spatial distribution of lesions along the nerve fibres, together with the synthesis of several partial diagnoses. Faced with this problem while building an expert system (NEUROP), we developed a heterogeneous knowledge representation associating a finite automaton with first-order logic. A number of knowledge representation problems raised by the features of electromyography tests are examined in this study, and the expert system architecture that allows such knowledge modeling is laid out.
Definition of evidence fusion rules on the basis of Referee Functions
This chapter defines a new concept and framework for constructing fusion rules for evidences. This framework is based on a referee function, which performs a decisional arbitration conditioned on the basic decisions provided by the several sources of information. A simple sampling method is derived from this framework. The purpose of this sampling approach is to avoid the combinatorial complexity inherent in the definition of evidence fusion rules. Defining the fusion rule by means of a sampling process makes it possible to construct rules on the basis of an algorithmic implementation of the referee function, instead of a mathematical formulation. Incidentally, it is a versatile and intuitive way of defining rules. The framework is implemented for various well-known evidence rules. On its basis, new rules for combining evidences are proposed, which take into account a consensual evaluation of the sources of information.
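The sampling construction described here can be sketched minimally. Everything below is an illustrative assumption rather than the chapter's formalism: each source is a basic belief assignment over focal sets, a simple referee arbitrates to the intersection of the sampled decisions (rejecting conflicts), and the empirical frequencies of the referee's outputs approximate the fused rule, here Dempster's conjunctive rule with normalization.

```python
import random

def sample_bba(bba):
    """Draw a focal element from a basic belief assignment
    given as {frozenset: mass} with masses summing to 1."""
    r, acc = random.random(), 0.0
    for focal, mass in bba.items():
        acc += mass
        if r <= acc:
            return focal
    return focal  # guard against floating-point rounding

def referee_conjunctive(decisions):
    """Referee mimicking conjunctive consensus: arbitrate to the
    intersection of the sources' decisions, rejecting conflicts."""
    out = frozenset.intersection(*decisions)
    return out if out else None   # None = conflicting sample, rejected

def fuse(bbas, referee, n=100_000, seed=0):
    """Monte Carlo estimate of the fused rule induced by a referee."""
    random.seed(seed)
    counts, kept = {}, 0
    for _ in range(n):
        out = referee([sample_bba(b) for b in bbas])
        if out is not None:
            counts[out] = counts.get(out, 0) + 1
            kept += 1
    return {k: v / kept for k, v in counts.items()}
```

Swapping in a different `referee` function yields a different fusion rule, which is precisely the flexibility the chapter claims: the rule is specified algorithmically, without enumerating the combinatorics of focal-set intersections.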
Flow of Activity in the Ouroboros Model
The Ouroboros Model is a new conceptual proposal for an algorithmic structure for efficient data processing in living beings as well as in artificial agents. Its central feature is a general repetitive loop in which one iteration cycle sets the stage for the next. Sensory input activates data structures (schemata) with constituents similar to those encountered before, and expectations are thus kindled. This corresponds to the highlighting of empty slots in the selected schema, and these expectations are compared with the actually encountered input. Depending on the outcome of this consumption analysis, different next steps are triggered, such as a search for further data or a reset, i.e. a new attempt employing another schema. Monitoring of the whole process, and in particular of the flow of activation directed by the consumption analysis, yields valuable feedback for the optimal allocation of attention and resources, including the selective establishment of useful new memory entries.
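One iteration of the loop can be caricatured in code. Representing schemata as feature sets and the consumption analysis as a slot-filling ratio with fixed thresholds is entirely our own simplifying assumption; the sketch only makes the activate-compare-decide cycle concrete.

```python
def consumption_analysis(schema, observed):
    """Fraction of a schema's slots filled by the observed input."""
    return len(schema & observed) / len(schema)

def ouroboros_step(schemata, observed, accept=0.8, reject=0.3):
    """One loop iteration: activate the best-matching schema, compare
    its expectations with the input, and choose the next action."""
    schema = max(schemata, key=lambda s: consumption_analysis(s, observed))
    score = consumption_analysis(schema, observed)
    if score >= accept:
        return schema, "consolidate"   # good fit: act / store new memory
    if score >= reject:
        return schema, "search"        # partial fit: seek further data
    return schema, "reset"             # poor fit: try another schema
```

The returned action would feed the next cycle (e.g. "search" directs attention to the schema's empty slots), which is the sense in which each iteration sets the stage for the next.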
A Combinatorial Algorithm to Compute Regularization Paths
Gärtner, Bernd, Giesen, Joachim, Jaggi, Martin, Welsch, Torsten
For a wide variety of regularization methods, algorithms computing the entire solution path have recently been developed. Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, making the selection of an optimal parameter much easier. Most currently used algorithms are not robust, in the sense that they cannot deal with general or degenerate input. Here we present a new robust, generic method for parametric quadratic programming. Our algorithm applies directly to nearly all machine learning applications, where so far every application has required its own different algorithm. We illustrate the usefulness of our method by applying it to a very low-rank problem that could not be solved by existing path tracking methods, namely computing part-worth values in choice-based conjoint analysis, a popular technique from market research for estimating consumers' preferences over a class of parameterized options.
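What a "solution path" is can be illustrated with the simplest possible case, which is not the authors' parametric QP method: for the one-dimensional lasso, the path has a closed form (soft-thresholding) and is piecewise linear in the regularization parameter, the structure that generic path-tracking algorithms exploit segment by segment.

```python
def soft_threshold_path(a, lambdas):
    """Regularization path of the 1-D lasso
        min_x  (x - a)**2 / 2 + lam * |x|,
    whose solution x(lam) = sign(a) * max(|a| - lam, 0)
    is piecewise linear in lam."""
    sign = 1.0 if a >= 0 else -1.0
    return [sign * max(abs(a) - lam, 0.0) for lam in lambdas]

# The solution shrinks linearly toward 0 and then stays there:
print(soft_threshold_path(2.0, [0.0, 0.5, 2.0, 3.0]))  # [2.0, 1.5, 0.0, 0.0]
```

A path algorithm returns this whole function of the parameter at once, so model selection reduces to scanning the breakpoints instead of re-solving the problem for each candidate value.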