Dimension reduction in representation of the data

arXiv.org Machine Learning

Suppose the data consist of a set $S$ of points $x_j$, $1 \leq j \leq J$, distributed in a bounded domain $D \subset \mathbb{R}^N$, where $N$ is a large number. An algorithm is given for finding the sets $L_k$ of dimension $k \ll N$, $k = 1, 2, \dots, K$, in a neighborhood of which a maximal number of the points $x_j \in S$ lie. The algorithm is different from PCA (principal component analysis).
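
A hedged formalization of the stated problem (ours, not spelled out in the abstract): for a tolerance $\varepsilon > 0$, each $L_k$ can be read as a $k$-dimensional set chosen to maximize the number of data points within distance $\varepsilon$ of it,

$$ L_k \;\in\; \arg\max_{L:\,\dim L = k} \;\#\{\, j : \operatorname{dist}(x_j, L) \leq \varepsilon \,\}. $$

This differs from the least-squares objective of PCA, $\min_L \sum_j \operatorname{dist}(x_j, L)^2$, in that points far from $L_k$ are simply ignored rather than penalized, so a small dense cluster near a low-dimensional set can win over a globally fitted subspace.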


Domain Adaptation: Learning Bounds and Algorithms

arXiv.org Artificial Intelligence

This paper addresses the general problem of domain adaptation which arises in a variety of applications where the distribution of the labeled sample available somewhat differs from that of the test data. Building on previous work by Ben-David et al. (2007), we introduce a novel distance between distributions, discrepancy distance, that is tailored to adaptation problems with arbitrary loss functions. We give Rademacher complexity bounds for estimating the discrepancy distance from finite samples for different loss functions. Using this distance, we derive novel generalization bounds for domain adaptation for a wide family of loss functions. We also present a series of novel adaptation bounds for large classes of regularization-based algorithms, including support vector machines and kernel ridge regression based on the empirical discrepancy. This motivates our analysis of the problem of minimizing the empirical discrepancy for various loss functions for which we also give novel algorithms. We report the results of preliminary experiments that demonstrate the benefits of our discrepancy minimization algorithms for domain adaptation.
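
For orientation, the discrepancy distance between distributions $P$ and $Q$ over the input space, relative to a hypothesis set $H$ and a loss function $L$, is (up to notational details) defined as

$$ \mathrm{disc}_L(P, Q) \;=\; \max_{h, h' \in H} \Bigl|\, \mathbb{E}_{x \sim P}\bigl[L(h'(x), h(x))\bigr] - \mathbb{E}_{x \sim Q}\bigl[L(h'(x), h(x))\bigr] \,\Bigr|. $$

For the 0-1 loss this essentially recovers the distance used by Ben-David et al. (2007); the gain is that the same definition makes sense for arbitrary losses, such as the squared loss underlying kernel ridge regression.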


Full First-Order Sequent and Tableau Calculi With Preservation of Solutions and the Liberalized delta-Rule but Without Skolemization

arXiv.org Artificial Intelligence

We present a combination of raising, explicit variable dependency representation, the liberalized delta-rule, and preservation of solutions for first-order deductive theorem proving. Our main motivation is to provide the foundation for our work on inductive theorem proving, where the preservation of solutions is indispensable.


lim+, delta+, and Non-Permutability of beta-Steps

arXiv.org Artificial Intelligence

Using a human-oriented formal example proof of the (lim+) theorem, i.e. that the sum of limits is the limit of the sum, which is of independent value as a reference, we exhibit a non-permutability of beta-steps and delta+-steps (in Smullyan's classification) that is not visible with non-liberalized delta-rules and not serious with further liberalized delta-rules, such as the delta++-rule. Besides a careful presentation of the search for a proof of (lim+) with several pedagogical intentions, the main subject is to explain why the order of beta-steps plays such a practically important role in some calculi.
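
For reference, the (lim+) theorem in standard epsilon-delta form, with the key witness any proof search must find (a sketch of the textbook argument, not of the paper's formal proof):

$$ \lim_{x \to x_0} f(x) = a \;\wedge\; \lim_{x \to x_0} g(x) = b \;\Longrightarrow\; \lim_{x \to x_0} \bigl(f(x) + g(x)\bigr) = a + b. $$

Given $\varepsilon > 0$, choose $\delta_f$ and $\delta_g$ for $\varepsilon/2$ and set $\delta = \min(\delta_f, \delta_g)$; then for $0 < |x - x_0| < \delta$,

$$ |f(x) + g(x) - (a + b)| \;\leq\; |f(x) - a| + |g(x) - b| \;<\; \tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2} \;=\; \varepsilon. $$

Roughly, the delta-steps of a tableau proof introduce the quantified bounds such as $\varepsilon$ and $\delta$, while the beta-steps branch over disjunctions, and their relative order governs how early witnesses like $\delta$ above can be instantiated.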


Syntactic Confluence Criteria for Positive/Negative-Conditional Term Rewriting Systems

arXiv.org Artificial Intelligence

We study how to combine the following known ideas for showing confluence of unconditional or conditional term rewriting systems into practically more useful confluence criteria for conditional systems: our syntactical separation into constructor and non-constructor symbols; Huet's introduction and Toyama's generalization of parallel closedness for non-noetherian unconditional systems; the use of shallow confluence for proving confluence of noetherian and non-noetherian conditional systems; the idea that certain kinds of limited confluence can be assumed when checking the fulfilledness or infeasibility of the conditions of conditional critical pairs; and the idea that (when termination is given) only prime superpositions have to be considered and certain normalization restrictions can be applied to the substitutions fulfilling the conditions of conditional critical pairs. Besides combining and improving known methods, we present the following new ideas and results: we strengthen the criterion for overlay joinable noetherian systems; and, using the expressiveness of our syntactical separation into constructor and non-constructor symbols, we present criteria for level confluence that are not actually criteria for shallow confluence, and we weaken the severe requirement of normality (stiffened with left-linearity) in the criteria for shallow confluence of noetherian and non-noetherian conditional systems to the easily satisfied requirement of quasi-normality. Finally, the whole paper may also serve as a practically useful overview of the syntactical means for showing confluence of conditional term rewriting systems.
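
As a toy illustration of one of the listed ideas (our example, not taken from the paper), consider the positive/negative-conditional rules

$$ \max(x, y) \to x \;\Leftarrow\; y \le x = \mathrm{true}, \qquad \max(x, y) \to y \;\Leftarrow\; y \le x = \mathrm{false}. $$

The two left-hand sides overlap at the root, yielding the conditional critical pair $\langle x, y \rangle$ with the combined condition $y \le x = \mathrm{true} \,\wedge\, y \le x = \mathrm{false}$. This condition is infeasible, so the critical pair can never fire and imposes no joinability obligation; criteria able to detect such infeasibility accept systems that purely syntactic overlap checks would reject.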


An Algebraic Dexter-Based Hypertext Reference Model

arXiv.org Artificial Intelligence

We present the first formal algebraic specification of a hypertext reference model. It is based on the well-known Dexter Hypertext Reference Model and includes modifications reflecting the development of hypertext since the advent of the WWW. Our hypertext model was developed as a product model with the aim of automatically supporting the design process, and it is extended to a model of hypertext systems in order to describe the state transitions in this process. While the specification should be easy to read for non-experts in algebraic specification, it guarantees a unique understanding and enables a close connection to logic-based development and verification.
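
To make "algebraic specification" concrete, here is a hedged sketch of what a tiny fragment of such a signature could look like (the sorts and operation names are our illustration, not the paper's actual specification):

$$ \begin{array}{ll} \textbf{sorts} & \mathit{Component},\ \mathit{Anchor},\ \mathit{Link},\ \mathit{Hypertext} \\ \textbf{ops} & \mathit{empty} : \mathit{Hypertext} \\ & \mathit{addComp} : \mathit{Hypertext} \times \mathit{Component} \to \mathit{Hypertext} \\ & \mathit{resolve} : \mathit{Hypertext} \times \mathit{Anchor} \to \mathit{Component} \end{array} $$

Equational axioms over such operations then fix the model's behaviour, including the state transitions of the design process, which is what enables the logic-based development and verification mentioned above.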


A Systematic Approach to Artificial Agents

arXiv.org Artificial Intelligence

Agents and agent systems are becoming more and more important in the development of a variety of fields such as ubiquitous computing, ambient intelligence, autonomous computing, intelligent systems, and intelligent robotics. Improving our basic knowledge of agents is therefore essential. We take a systematic approach and present an extended classification of artificial agents, which can be useful for understanding what artificial agents are and what they may become in the future. The aim of this classification is to give insight into what kinds of agents can be created and what types of problems demand a specific kind of agent for their solution.


Lanczos Approximations for the Speedup of Kernel Partial Least Squares Regression

arXiv.org Machine Learning

The runtime for Kernel Partial Least Squares (KPLS) to compute the fit is quadratic in the number of examples. However, the necessity of obtaining sensitivity measures as degrees of freedom for model selection or confidence intervals for more detailed analysis requires cubic runtime, and thus constitutes a computational bottleneck in real-world data analysis. We propose a novel algorithm for KPLS which not only computes (a) the fit, but also (b) its approximate degrees of freedom and (c) error bars in quadratic runtime. The algorithm exploits a close connection between Kernel PLS and the Lanczos algorithm for approximating the eigenvalues of symmetric matrices, and uses this approximation to compute the trace of powers of the kernel matrix in quadratic runtime.
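
The trace-of-powers step can be sketched as follows. This is a minimal stochastic Lanczos quadrature sketch in the spirit of the abstract, not the paper's actual algorithm; the function names are ours, and we assume a dense symmetric kernel matrix K so that each matrix-vector product costs $O(n^2)$:

import numpy as np

def lanczos(K, v0, m):
    # m steps of the Lanczos iteration on the symmetric matrix K,
    # with full reorthogonalization for numerical stability.
    n = K.shape[0]
    Q = np.zeros((n, m))
    alphas, betas = np.zeros(m), np.zeros(max(m - 1, 0))
    q = v0 / np.linalg.norm(v0)
    for j in range(m):
        Q[:, j] = q
        w = K @ q
        alphas[j] = q @ w
        w = w - Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)  # reorthogonalize
        if j < m - 1:
            beta = np.linalg.norm(w)
            if beta < 1e-12:               # invariant subspace found early
                return alphas[:j + 1], betas[:j]
            betas[j] = beta
            q = w / beta
    return alphas, betas

def trace_power_slq(K, p, m=20, n_probes=10, seed=0):
    # Estimate tr(K^p): for each Rademacher probe z,
    #   z^T K^p z  ~  n * sum_i tau_i^2 * theta_i^p,
    # where theta_i are the Ritz values (eigenvalues of the Lanczos
    # tridiagonal matrix) and tau_i the first components of its
    # eigenvectors. Total cost is O(n_probes * m * n^2), i.e.
    # quadratic in the number of examples n.
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    total = 0.0
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        a, b = lanczos(K, z, m)
        T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
        theta, U = np.linalg.eigh(T)
        total += n * np.sum(U[0, :] ** 2 * theta ** p)
    return total / n_probes

Because only matrix-vector products with K are needed, the same estimate works whenever K is available implicitly, which is what keeps the overall runtime quadratic rather than cubic.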


Symbolic Computing with Incremental Mindmaps to Manage and Mine Data Streams - Some Applications

arXiv.org Artificial Intelligence

In our understanding, a mind-map is an adaptive engine that works incrementally on the foundation of existing transactional streams. Generally, mind-maps consist of symbolic cells that are connected with each other and that become either stronger or weaker depending on the transactional stream. Following the underlying biological principle, these symbolic cells and their connections may adaptively survive or die, forming cell agglomerates of arbitrary size. In this work, we aim to demonstrate the suitability of mind-maps in diverse application scenarios, for example as an underlying management system representing normal and abnormal traffic behaviour in computer networks, as support for detecting user behaviour within search engines, or as a hidden communication layer for natural language interaction.
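
A minimal sketch of the reinforce/decay/prune dynamics described above (our illustration; the class and parameter names are hypothetical, not the authors' implementation):

from collections import defaultdict

class MindMap:
    # Symbolic cells and their pairwise connections are reinforced by
    # co-occurrence in a transactional stream, decay at every step, and
    # are pruned ("die") once their weight falls below a threshold.
    def __init__(self, reinforce=1.0, decay=0.95, prune_below=0.05):
        self.cells = defaultdict(float)
        self.edges = defaultdict(float)
        self.reinforce = reinforce
        self.decay = decay
        self.prune_below = prune_below

    def observe(self, transaction):
        for d in (self.cells, self.edges):
            for key in list(d):
                d[key] *= self.decay           # weaken everything a bit
                if d[key] < self.prune_below:
                    del d[key]                 # cell/connection dies
        items = sorted(set(transaction))
        for s in items:
            self.cells[s] += self.reinforce    # strengthen observed cells
        for i, a in enumerate(items):
            for b in items[i + 1:]:
                self.edges[(a, b)] += self.reinforce  # and their co-occurrence

# Example: feeding network-traffic-like transactions; frequently
# co-occurring symbols form strong agglomerates, rare ones die out.
mm = MindMap()
for t in [["syn", "ack"], ["syn", "ack"], ["syn", "flood"]]:
    mm.observe(t)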


ASF+: An ASF-like Specification Language

arXiv.org Artificial Intelligence

Maintaining the main aspects of the algebraic specification language ASF as presented in [Bergstra&al.89], we have extended ASF with the following concepts: While names once exported in ASF must stay visible up to the top of the module hierarchy, ASF+ permits a more sophisticated hiding of signature names. The erroneous merging of distinct structures that occurs in ASF when importing different actualizations of the same parameterized module is avoided in ASF+ by a more adequate form of parameter binding. The new "Namensraum" (namespace) concept of ASF+ permits the specifier, on the one hand, to directly identify the origin of hidden names and, on the other, to decide whether an imported module is only to be accessed or whether an important property of it is to be modified: in the first case, the specifier can access one single globally provided version; in the second, a copy of the module has to be imported. Finally, ASF+ permits semantic conditions on parameters and the specification of tasks for a theorem prover.