Gaussian Belief with dynamic data and in dynamic network
In this paper we analyse Belief Propagation over a Gaussian model in a dynamic environment. Recently, this has been proposed as a method to average local measurement values by a distributed protocol ("Consensus Propagation", Moallemi & Van Roy, 2006), where the average is available for read-out at every single node. In the case that the underlying network is constant but the values to be averaged fluctuate ("dynamic data"), convergence and accuracy are determined by the spectral properties of an associated Ruelle-Perron-Frobenius operator. For Gaussian models on Erdős-Rényi graphs, numerical computation points to a spectral gap remaining in the large-size limit, implying exceptionally good scalability. In a model where the underlying network also fluctuates ("dynamic network"), averaging is more effective than in the dynamic data case. Altogether, this implies very good performance of these methods in very large systems, and opens a new field of statistical physics of large (and dynamic) information systems.
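For readers who want to experiment, the following is a minimal sketch of synchronous consensus-propagation-style averaging on a fixed graph with static data, i.e. the baseline setting that the dynamic-data and dynamic-network analyses above build on. The update equations follow the form we recall from Moallemi & Van Roy (2006) with unit edge weights; beta, the iteration count, and the graph parameters are illustrative assumptions rather than settings from the paper.

# Sketch: synchronous consensus-propagation-style averaging (static data,
# static network). Update equations as recalled from Moallemi & Van Roy
# (2006), unit edge weights; beta and iteration count are illustrative.
import networkx as nx

def consensus_propagation(graph, values, beta=100.0, iterations=200):
    # One (K, mu) message per directed edge, initialised to zero.
    K = {(i, j): 0.0 for i in graph for j in graph[i]}
    mu = {(i, j): 0.0 for i in graph for j in graph[i]}
    for _ in range(iterations):
        new_K, new_mu = {}, {}
        for i in graph:
            for j in graph[i]:
                # Incoming messages to i, excluding the one coming from j.
                k_sum = sum(K[(u, i)] for u in graph[i] if u != j)
                m_sum = sum(K[(u, i)] * mu[(u, i)] for u in graph[i] if u != j)
                new_K[(i, j)] = (1.0 + k_sum) / (1.0 + (1.0 + k_sum) / beta)
                new_mu[(i, j)] = (values[i] + m_sum) / (1.0 + k_sum)
        K, mu = new_K, new_mu
    # Read-out: every node forms its own estimate of the global average.
    estimates = {}
    for i in graph:
        k_sum = sum(K[(u, i)] for u in graph[i])
        m_sum = sum(K[(u, i)] * mu[(u, i)] for u in graph[i])
        estimates[i] = (values[i] + m_sum) / (1.0 + k_sum)
    return estimates

# Example on an Erdős-Rényi graph: node estimates should cluster near the mean.
g = nx.erdos_renyi_graph(50, 0.1, seed=0)
y = {i: float(i) for i in g}
est = consensus_propagation(g, y)
print(min(est.values()), max(est.values()), sum(y.values()) / len(y))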
Deformed Statistics Formulation of the Information Bottleneck Method
Venkatesan, R. C., Plastino, A.
The theoretical basis for a candidate variational principle for the information bottleneck (IB) method is formulated within the ambit of the generalized nonadditive statistics of Tsallis. Given a nonadditivity parameter $ q $, the role of the \textit{additive duality} of nonadditive statistics ($ q^*=2-q $) in relating Tsallis entropies for ranges of the nonadditivity parameter $ q < 1 $ and $ q > 1 $ is described. Defining $ X $, $ \tilde X $, and $ Y $ to be the source alphabet, the compressed reproduction alphabet, and the \textit{relevance variable}, respectively, it is demonstrated that minimization of a generalized IB (gIB) Lagrangian defined in terms of the nonadditivity parameter $ q^* $ self-consistently yields the \textit{nonadditive effective distortion measure} to be the \textit{$ q $-deformed} generalized Kullback-Leibler divergence: $ D_{K-L}^{q}[p(Y|X)||p(Y|\tilde X)] $. This result is achieved without enforcing any \textit{a priori} assumptions. Next, it is proven that the $ q^* $-deformed nonadditive free energy of the system is non-negative and convex. Finally, the update equations for the gIB method are derived. These results generalize critical features of the IB method to the case of Tsallis statistics.
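For concreteness, one standard convention for the $ q $-deformed generalized Kullback-Leibler divergence invoked above, written here for generic distributions $ p $ and $ r $ (the exact normalization used by the authors may differ), is
$$ D_{K-L}^{q}[p||r] \, = \, \sum_{x} p(x)\, \frac{\bigl(p(x)/r(x)\bigr)^{q-1} - 1}{q-1} \, = \, -\sum_{x} p(x)\, \ln_q\!\frac{r(x)}{p(x)}, \qquad \ln_q(u) = \frac{u^{1-q} - 1}{1-q}, $$
which recovers the ordinary Kullback-Leibler divergence in the limit $ q \to 1 $; the additive duality then relates the $ q $- and $ q^* = 2-q $ indexed quantities.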
FaceBots: Steps Towards Enhanced Long-Term Human-Robot Interaction by Utilizing and Publishing Online Social Information
Mavridis, Nikolaos, Emami, Shervin, Datta, Chandan, Kazmi, Wajahat, BenAbdelkader, Chiraz, Toulis, Panos, Tanoto, Andry, Rabie, Tamer
Our project aims at supporting the creation of sustainable and meaningful longer-term human-robot relationships through embodied robots with face recognition and natural language dialogue capabilities, which exploit and publish social information available on the web (Facebook). Our main underlying experimental hypothesis is that such relationships can be significantly enhanced if the human and the robot gradually create a pool of shared episodic memories that they can co-refer to (shared memories), and if they are both embedded in a social web of other humans and robots they both know and encounter (shared friends). In this paper, we present such a robot which, as we will show, achieves two significant novelties.
Adaptive Learning with Binary Neurons
Torres-Moreno, Juan-Manuel, Gordon, Mirta B.
An efficient incremental learning algorithm for classification tasks, called NetLines, well adapted to both binary and real-valued input patterns, is presented. It generates small compact feedforward neural networks with one hidden layer of binary units and binary output units. A convergence theorem ensures that solutions with a finite number of hidden units exist for both binary and real-valued input patterns. An implementation for problems with more than two classes, valid for any binary classifier, is proposed. The generalization error and the size of the resulting networks are compared to the best published results on well-known classification benchmarks. Early stopping is shown to decrease overfitting, without improving the generalization performance.
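To make the constructive idea concrete, here is a minimal sketch of incremental learning with binary (+/-1) hidden perceptrons, in the spirit of, but not identical to, NetLines: the target assignment for new hidden units (flagging the current network's errors) is a simplified stand-in for the paper's construction, and max_hidden and the perceptron hyperparameters are illustrative.

# Sketch: constructive classifier with binary hidden units (not the exact
# NetLines growth rule). Labels y are expected in {-1, +1}.
import numpy as np

def perceptron(X, y, epochs=200, lr=0.1):
    Xb = np.hstack([X, np.ones((len(X), 1))])        # append a bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (xi @ w) <= 0:                   # misclassified pattern
                w += lr * yi * xi
    return w

def binary_out(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.where(Xb @ w >= 0, 1.0, -1.0)          # binary (+/-1) unit

def grow(X, y, max_hidden=10):
    y = np.asarray(y, dtype=float)
    hidden = [perceptron(X, y)]                      # first unit: raw targets
    while True:
        H = np.column_stack([binary_out(w, X) for w in hidden])
        w_out = perceptron(H, y)                     # (re)train the output unit
        errors = binary_out(w_out, H) != y
        if not errors.any() or len(hidden) >= max_hidden:
            return hidden, w_out                     # small net, binary units
        # New hidden unit: separate currently wrong from currently right examples.
        hidden.append(perceptron(X, np.where(errors, 1.0, -1.0)))

Calling grow(X, y) on an (n, d) array X returns the hidden-unit and output-unit weight vectors; prediction applies binary_out to the hidden representation of a new pattern.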
Characterizations of Stable Model Semantics for Logic Programs with Arbitrary Constraint Atoms
Shen, Yi-Dong, You, Jia-Huai, Yuan, Li-Yan
This paper studies the stable model semantics of logic programs with (abstract) constraint atoms and their properties. We introduce a succinct abstract representation in which each constraint atom is encoded compactly, and we show two applications. First, under this representation of constraint atoms, we generalize the Gelfond-Lifschitz transformation and apply it to define stable models (also called answer sets) for logic programs with arbitrary constraint atoms. The resulting semantics turns out to coincide with the one defined by Son et al., which is based on a fixpoint approach. One advantage of our approach is that it can be applied, in a natural way, to define stable models for disjunctive logic programs with constraint atoms, which may appear in the disjunctive head as well as in the body of a rule. As a result, our approach to the stable model semantics for logic programs with constraint atoms generalizes a number of previous approaches. Second, we show that our abstract representation of constraint atoms provides a means to characterize dependencies of atoms in a program with constraint atoms, so that standard characterizations and properties that rely on these dependencies for logic programs with ordinary atoms can be extended to logic programs with constraint atoms.
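To make the classical starting point concrete, the sketch below computes stable models of a normal logic program over ordinary atoms using the standard Gelfond-Lifschitz reduct and a brute-force search over candidate sets; it does not implement the paper's generalized transformation for constraint atoms, and the rule encoding (head, positive body, negative body) is an illustrative choice.

# Sketch: stable models of a normal logic program with ordinary atoms via the
# classical Gelfond-Lifschitz reduct (the baseline generalized by the paper).
from itertools import chain, combinations

def reduct(program, M):
    """Gelfond-Lifschitz reduct with respect to a candidate set M."""
    return [(h, pos) for (h, pos, neg) in program if not (set(neg) & M)]

def least_model(definite_program):
    """Least model of a definite program via the T_P fixpoint."""
    model, changed = set(), True
    while changed:
        changed = False
        for h, pos in definite_program:
            if set(pos) <= model and h not in model:
                model.add(h)
                changed = True
    return model

def stable_models(program):
    atoms = sorted({h for (h, _, _) in program}
                   | {a for (_, pos, neg) in program for a in chain(pos, neg)})
    candidates = chain.from_iterable(combinations(atoms, r)
                                     for r in range(len(atoms) + 1))
    return [set(M) for M in candidates
            if least_model(reduct(program, set(M))) == set(M)]

# Example: {p :- not q.  q :- not p.} has exactly the stable models {p} and {q}.
prog = [("p", [], ["q"]), ("q", [], ["p"])]
print(stable_models(prog))    # [{'p'}, {'q'}]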
Semantic Social Network Analysis
Erétéo, Guillaume, Gandon, Fabien, Corby, Olivier, Buffa, Michel
Social Network Analysis (SNA) tries to understand and exploit the key features of social networks in order to manage their life cycle and predict their evolution. Increasingly popular web 2.0 sites are forming huge social networks, and classical SNA methods have been applied to such online networks. In this paper, we propose leveraging semantic web technologies to merge and exploit the best features of each domain. We present how to facilitate and enhance the analysis of online social networks, exploiting the power of semantic social network analysis.
Considerations upon the Machine Learning Technologies
Munteanu, Alin, Sofran, Cristina Ofelia
Artificial intelligence offers superior techniques and methods by which problems from diverse domains may find an optimal solution. Machine Learning technologies refer to the domain of artificial intelligence that aims to develop techniques allowing computers to "learn". Some systems based on Machine Learning technologies tend to eliminate the need for human intelligence, while others adopt a man-machine collaborative approach.
Variations of the Turing Test in the Age of Internet and Virtual Reality
Neumann, Florentin, Reichenberger, Andrea, Ziegler, Martin
Inspired by Hofstadter's Coffee-House Conversation (1982) and by the science fiction short story SAM by Schattschneider (1988), we propose and discuss criteria for non-mechanical intelligence. Firstly, we emphasize the practical need for such tests in view of massively multiuser online role-playing games (MMORPGs) and virtual reality systems like Second Life. Secondly, we demonstrate that Second Life provides a useful framework for implementing (some iterations of) such a test.
Lexicographic probability, conditional probability, and nonstandard probability
The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS's), and nonstandard probability spaces (NPS's) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS's are equivalent to NPS's. However, if the state space is infinite, NPS's are shown to be more general than LPS's.
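As a finite-state illustration of the correspondence discussed above (a sketch; the normalization is our illustrative choice, not necessarily the construction used in the paper), an LPS $ (\mu_0, \ldots, \mu_k) $ can be encoded as a single nonstandard probability measure by weighting successive levels with an infinitesimal $ \epsilon $:
$$ \Pr\nolimits_{\epsilon}(A) \, = \, \frac{\sum_{i=0}^{k} \epsilon^{i} \mu_i(A)}{\sum_{i=0}^{k} \epsilon^{i}}, $$
so that events assigned probability zero by $ \mu_0 $ still receive positive, infinitesimal probability from lower-ranked measures; it is in the infinite state space case that NPS's become strictly more general.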
Non-Negative Matrix Factorization, Convexity and Isometry
Vasiloglou, Nikolaos, Gray, Alexander G., Anderson, David V.
In this paper we explore avenues for improving the reliability of dimensionality reduction methods such as Non-Negative Matrix Factorization (NMF) as interpretive exploratory data analysis tools. We first explore the difficulties of the optimization problem underlying NMF, showing for the first time that non-trivial NMF solutions always exist and that the optimization problem is actually convex, by using the theory of Completely Positive Factorization. We subsequently explore four novel approaches to finding globally-optimal NMF solutions using various ideas from convex optimization. We then develop a new method, isometric NMF (isoNMF), which preserves non-negativity while also providing an isometric embedding, simultaneously achieving two properties which are helpful for interpretation. Though it results in a more difficult optimization problem, we show experimentally that the resulting method is scalable and even achieves more compact spectra than standard NMF.
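As a point of reference for the discussion above, here is a minimal sketch of plain NMF with the well-known Lee-Seung multiplicative updates for the Frobenius-norm objective. It is the standard baseline only, not the convex formulation or isoNMF introduced in the paper; rank, iterations, and eps are illustrative settings.

# Sketch: standard NMF via Lee-Seung multiplicative updates (baseline only,
# not the convex formulation or isoNMF described in the paper).
import numpy as np

def nmf(V, rank, iterations=500, eps=1e-10, seed=0):
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank))                 # non-negative initialisation
    H = rng.random((rank, m))
    for _ in range(iterations):
        # Multiplicative updates keep W and H entrywise non-negative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Example: factor a random non-negative matrix and report the relative error.
V = np.random.default_rng(1).random((30, 20))
W, H = nmf(V, rank=5)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))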