arXiv.org Artificial Intelligence
A Data-Parallel Version of Aleph
This paper presents work on modifying the Aleph ILP system so that it evaluates hypothesised clauses in parallel by distributing the data-set among the nodes of a parallel or distributed machine. The paper briefly discusses MPI, the interface used to access message-passing libraries for parallel computers and clusters. It then describes an extension of YAP Prolog with an MPI interface and an implementation of data-parallel clause evaluation for Aleph through this interface. The paper concludes by testing the data-parallel Aleph on artificially constructed data-sets.
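The abstract gives no code; the following is a minimal sketch of the data-parallel evaluation idea, assuming Python with mpi4py rather than the paper's actual YAP Prolog + MPI implementation. The clause and example representations, the covers test, and the scatter/broadcast/reduce layout are illustrative assumptions, not the authors' API.

    # A minimal sketch of data-parallel clause evaluation with MPI, assuming
    # Python + mpi4py instead of the paper's YAP Prolog + MPI implementation.
    # The clause/example representations below are toy stand-ins.
    from mpi4py import MPI

    def covers(clause, example):
        # Toy coverage test: a "clause" is a set of required attributes and
        # an "example" is a set of attributes; the clause covers the example
        # if all required attributes are present.
        return clause <= example

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    if rank == 0:
        # The master node builds the data-set and one shard per node.
        dataset = [{"a", "b"}, {"a"}, {"a", "b", "c"}, {"b"}] * 100
        shards = [dataset[i::size] for i in range(size)]
        clause = {"a", "b"}          # hypothesised clause to evaluate
    else:
        shards, clause = None, None

    local_examples = comm.scatter(shards, root=0)   # distribute the data-set
    clause = comm.bcast(clause, root=0)             # broadcast the clause

    # Each node evaluates the clause on its own shard; partial counts are summed.
    local_count = sum(1 for ex in local_examples if covers(clause, ex))
    total = comm.reduce(local_count, op=MPI.SUM, root=0)
    if rank == 0:
        print("examples covered:", total)

Run with, for example, "mpirun -n 4 python evaluate.py"; only the coverage counts travel between nodes, while the example shards stay where they were scattered.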
A Practical Ontology for the Large-Scale Modeling of Scholarly Artifacts and their Usage
Rodriguez, Marko A., Bollen, Johan, Van de Sompel, Herbert
The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerning the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exist, that supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which makes the scholarly process amenable to statistical analysis and computational support. We present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.
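The article's ontology classes and rule syntax are not reproduced here; the sketch below is only an illustration of the general idea, assuming a toy edge-list semantic network and an invented metric (per-author usage count), neither of which is taken from the article.

    # Illustrative sketch only: a toy semantic-network model of scholarly data.
    # The edge labels and the metric are assumptions, not the article's
    # actual ontology classes or inference rules.
    from collections import defaultdict

    edges = [
        ("author:rodriguez", "authored", "article:1"),
        ("author:bollen",    "authored", "article:1"),
        ("article:2",        "cites",    "article:1"),
        ("user:42",          "used",     "article:1"),
        ("user:42",          "used",     "article:2"),
    ]

    # Example "inference rule": an author's usage count is the number of
    # usage events on articles that the author authored.
    authored = defaultdict(set)
    usage = defaultdict(int)
    for s, p, o in edges:
        if p == "authored":
            authored[s].add(o)
        elif p == "used":
            usage[o] += 1

    author_usage = {a: sum(usage[art] for art in arts) for a, arts in authored.items()}
    print(author_usage)   # {'author:rodriguez': 1, 'author:bollen': 1}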
Modeling Visual Information Processing in Brain: A Computer Vision Point of View and Approach
We live in the Information Age, and information has become a critically important component of our life. The success of the Internet made huge amounts of it easily available and accessible to everyone. To keep the flow of this information manageable, means for its faultless circulation and effective handling have become urgently required. Considerable research efforts are dedicated today to address this necessity, but they are seriously hampered by the lack of a common agreement about "What is information?" In particular, what is "visual information" - a human's primary input from the surrounding world. The problem is further aggravated by a long-lasting stance borrowed from biological vision research that regards human-like information processing as an enigmatic mix of perceptual and cognitive vision faculties. I am trying to find a remedy for this bizarre situation. Relying on a new definition of "information", which can be derived from Kolmogorov's complexity theory and Chaitin's notion of algorithmic information, I propose a unifying framework for visual information processing, which explicitly accounts for the perceptual and cognitive image processing peculiarities. I believe that this framework will be useful to overcome the difficulties that are impeding our attempts to develop the right model of human-like intelligent image processing.
A preliminary analysis on metaheuristics methods applied to the Haplotype Inference Problem
Di Gaspero, Luca, Roli, Andrea
Haplotype Inference is a challenging problem in bioinformatics that consists in inferring the basic genetic constitution of diploid organisms on the basis of their genotype. This information allows researchers to perform association studies for the genetic variants involved in diseases and the individual responses to therapeutic agents. A notable approach to the problem is to encode it as a combinatorial problem (under certain hypotheses, such as the pure parsimony criterion) and to solve it using off-the-shelf combinatorial optimization techniques. The main methods applied to Haplotype Inference are either simple greedy heuristics or exact methods (Integer Linear Programming, Semidefinite Programming, SAT encoding) that, at present, are adequate only for moderate size instances. We believe that metaheuristic and hybrid approaches could provide better scalability. Moreover, metaheuristics can be very easily combined with problem-specific heuristics and they can also be integrated with tree-based search techniques, thus providing a promising framework for hybrid systems in which a good trade-off between effectiveness and efficiency can be reached. In this paper we illustrate a feasibility study of the approach and discuss some relevant design issues, such as modeling and design of approximate solvers that combine constructive heuristics, local search-based improvement strategies and learning mechanisms. Besides the relevance of the Haplotype Inference problem itself, this preliminary analysis is also an interesting case study because the formulation of the problem poses some challenges in modeling and hybrid metaheuristic solver design that can be generalized to other problems.
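As a small illustration of the combinatorial formulation (not the paper's solver), the sketch below uses the standard 0/1/2 site encoding (0 and 1 homozygous, 2 heterozygous), checks when a pair of haplotypes resolves a genotype, and runs a naive greedy heuristic that prefers already-used haplotypes as a crude proxy for the pure parsimony objective.

    # Minimal sketch (not the paper's method): genotype resolution and a naive
    # greedy heuristic for the pure-parsimony view of Haplotype Inference.
    from itertools import product

    def resolves(h1, h2, g):
        # Sites coded 0/1 are homozygous (both haplotypes must match the value);
        # sites coded 2 are heterozygous (the haplotypes must differ).
        return all((a == b == s) if s in (0, 1) else (a != b)
                   for a, b, s in zip(h1, h2, g))

    def candidate_pairs(g):
        # Enumerate all haplotype pairs compatible with genotype g.
        choices = [[(s, s)] if s in (0, 1) else [(0, 1), (1, 0)] for s in g]
        for combo in product(*choices):
            yield tuple(a for a, _ in combo), tuple(b for _, b in combo)

    def greedy_infer(genotypes):
        # Greedily resolve each genotype, reusing known haplotypes when possible
        # (a crude proxy for minimising the number of distinct haplotypes).
        used = set()
        for g in genotypes:
            best = max(candidate_pairs(g),
                       key=lambda p: (p[0] in used) + (p[1] in used))
            assert resolves(*best, g)
            used.update(best)
        return used

    print(greedy_infer([(2, 0, 2), (2, 0, 0), (1, 0, 1)]))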
Bijective Faithful Translations among Default Logics
In this article, we study translations between variants of default logics such that the extensions of the theories that are the input and the output of the translation are in a bijective correspondence. We assume that a translation can introduce new variables and that the result of translating a theory can either be produced in time polynomial in the size of the theory or its output is polynomial in that size; we however restrict attention to the case in which the original theory has extensions. This study fills a gap between two previous pieces of work, one studying bijective translations among restrictions of default logics, and the other one studying non-bijective translations between default logic variants.
A Leaf Recognition Algorithm for Plant Classification Using Probabilistic Neural Network
Wu, Stephen Gang, Bao, Forrest Sheng, Xu, Eric You, Wang, Yu-Xuan, Chang, Yi-Fan, Xiang, Qiao-Liang
In this paper, we employ a Probabilistic Neural Network (PNN) with image and data processing techniques to implement a general purpose automated leaf recognition algorithm. 12 leaf features are extracted and orthogonalized into 5 principal variables which constitute the input vector of the PNN. The PNN is trained on 1800 leaves to classify 32 kinds of plants with an accuracy greater than 90%. Compared with other approaches, our algorithm is an accurate artificial intelligence approach which is fast in execution and easy to implement.
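The paper's feature extraction and leaf data are not reproduced here; the sketch below, on synthetic data, only mirrors the pipeline the abstract describes: reduce 12 features to 5 principal components, then classify with a PNN, here taken to be a plain Gaussian Parzen-window classifier with an assumed smoothing parameter.

    # Sketch of the abstract's pipeline on synthetic data:
    # 12 features -> 5 principal components -> PNN (Gaussian Parzen-window
    # classifier; the class counts and sigma are assumptions, not the paper's).
    import numpy as np

    rng = np.random.default_rng(0)
    n_classes, per_class = 4, 50                 # toy stand-in for 32 plant species
    X = np.vstack([rng.normal(c, 1.0, size=(per_class, 12)) for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), per_class)

    # PCA: orthogonalise the 12 features into 5 principal variables.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:5].T

    def pnn_predict(Z_train, y_train, z, sigma=0.5):
        # One Gaussian kernel per training pattern; sum per class, take argmax.
        d2 = ((Z_train - z) ** 2).sum(axis=1)
        k = np.exp(-d2 / (2 * sigma ** 2))
        scores = [k[y_train == c].sum() for c in np.unique(y_train)]
        return int(np.argmax(scores))

    pred = np.array([pnn_predict(Z, y, z) for z in Z])
    print("training accuracy:", (pred == y).mean())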
Learning Probabilistic Models of Word Sense Disambiguation
This dissertation presents several new methods of supervised and unsupervised learning of word sense disambiguation models. The supervised methods focus on performing model searches through a space of probabilistic models, and the unsupervised methods rely on the use of Gibbs Sampling and the Expectation Maximization (EM) algorithm. In both the supervised and unsupervised case, the Naive Bayesian model is found to perform well. An explanation for this success is presented in terms of learning rates and bias-variance decompositions.
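The dissertation's corpora and model-search procedures are not shown here; purely as a small illustration of the supervised setting it studies, the sketch below trains a Naive Bayesian model on bag-of-context-words features for a toy ambiguous word ("bank"). The example sentences and sense labels are invented.

    # Toy illustration of supervised WSD with a Naive Bayesian model
    # (invented data; not the dissertation's corpora or model-search methods).
    from collections import Counter, defaultdict
    import math

    train = [
        ("deposit money at the bank account", "FINANCE"),
        ("the bank raised interest rates", "FINANCE"),
        ("we fished from the river bank", "RIVER"),
        ("the muddy bank of the stream", "RIVER"),
    ]

    sense_counts = Counter(sense for _, sense in train)
    word_counts = defaultdict(Counter)
    for text, sense in train:
        word_counts[sense].update(text.split())

    def predict(text, alpha=1.0):
        # Multinomial Naive Bayes with Laplace smoothing over context words.
        vocab = {w for c in word_counts.values() for w in c}
        best, best_lp = None, -math.inf
        for sense, n in sense_counts.items():
            lp = math.log(n / len(train))
            total = sum(word_counts[sense].values())
            for w in text.split():
                lp += math.log((word_counts[sense][w] + alpha) / (total + alpha * len(vocab)))
            if lp > best_lp:
                best, best_lp = sense, lp
        return best

    print(predict("interest rates at the bank"))   # expected: FINANCE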
Practical Approach to Knowledge-based Question Answering with Natural Language Understanding and Advanced Reasoning
This research hypothesized that a practical approach in the form of a solution framework known as Natural Language Understanding and Reasoning for Intelligence (NaLURI), which combines full-discourse natural language understanding, powerful representation formalism capable of exploiting ontological information and reasoning approach with advanced features, will solve the following problems without compromising practicality factors: 1) restriction on the nature of question and response, and 2) limitation to scale across domains and to real-life natural language text.
A Generalized Information Formula as the Bridge between Shannon and Popper
A generalized information formula related to logical probability and fuzzy sets is deduced from the classical information formula. The new information measure accords well with Popper's criterion for knowledge evolution. In comparison with the squared-error criterion, the information criterion not only reflects the error of a proposition, but also reflects the particularity of the event described by the proposition. It gives a proposition with lower logical probability a higher evaluation. The paper explains how to select a prediction or sentence from many candidates, for forecasts and language translations, according to the generalized information criterion. It also introduces the rate-fidelity theory, which improves on the rate-distortion theory of classical information theory by replacing the distortion (i.e. average error) criterion with the generalized mutual information criterion, for data compression and communication efficiency. Some interesting conclusions are obtained from the rate-fidelity function in relation to image communication. The paper also discusses how to improve Popper's theory.
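For reference, the classical (Shannon) information formula that the abstract starts from can be written as

    I(x_i ; y_j) = log [ P(x_i | y_j) / P(x_i) ]

i.e. the information a message y_j conveys about an outcome x_i. The paper's generalization replaces these ordinary probabilities with logical probabilities (truth-function values of fuzzy sets); the exact generalized formula is given in the paper itself and is not reconstructed here.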
Neutrality and Many-Valued Logics
Schumann, Andrew, Smarandache, Florentin
In this book, we consider various many-valued logics: standard, linear, hyperbolic, parabolic, non-Archimedean, p-adic, interval, neutrosophic, etc. We also survey results on three different proof-theoretic frameworks for many-valued logics, namely deductive calculi in Hilbert style, sequent style, and hypersequent style. We present a general way to systematically construct analytic calculi for a large family of non-Archimedean many-valued logics: hyperrational-valued, hyperreal-valued, and p-adic-valued logics, characterized by a special format of semantics with an appropriate rejection of Archimedes' axiom. These logics are built as different extensions of standard many-valued logics (namely Lukasiewicz's, Goedel's, Product, and Post's logics). The informal sense of Archimedes' axiom is that anything can be measured by a ruler. Logical multiple-validity without Archimedes' axiom means that the set of truth values is infinite and is neither well-founded nor well-ordered. On the basis of non-Archimedean valued logics, we construct the non-Archimedean valued interval neutrosophic logic INL, by which we can describe neutrality phenomena.
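The book's non-Archimedean and neutrosophic constructions are not reproduced here; as a small concrete reference point for three of the standard many-valued logics it extends (Lukasiewicz, Goedel, Product), the sketch below implements their usual [0,1]-valued implication connectives. These truth functions are the standard ones; the code itself is not taken from the book.

    # Standard [0,1]-valued implications of three basic many-valued logics
    # the book builds on (standard definitions; not code from the book).

    def lukasiewicz_implies(a, b):
        # Lukasiewicz: min(1, 1 - a + b)
        return min(1.0, 1.0 - a + b)

    def goedel_implies(a, b):
        # Goedel: 1 if a <= b, else b
        return 1.0 if a <= b else b

    def product_implies(a, b):
        # Product: 1 if a <= b, else b / a
        return 1.0 if a <= b else b / a

    def lukasiewicz_not(a):
        return 1.0 - a

    for imp in (lukasiewicz_implies, goedel_implies, product_implies):
        print(imp.__name__, imp(0.8, 0.5))
    # roughly 0.7, 0.5, 0.625 (up to floating point)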