Reports on the 2013 AAAI Fall Symposium Series
Burns, Gully (Information Sciences Institute, University of Southern California) | Gil, Yolanda (Information Sciences Institute and Department of Computer Science, University of Southern California) | Liu, Yan (University of Southern California) | Villanueva-Rosales, Natalia (University of Texas at El Paso) | Risi, Sebastian (University of Copenhagen) | Lehman, Joel (University of Texas at Austin) | Clune, Jeff (University of Wyoming) | Lebiere, Christian (Carnegie Mellon University) | Rosenbloom, Paul S. (University of Southern California) | Harmelen, Frank van (Vrije Universiteit Amsterdam) | Hendler, James A. (Rensselaer Polytechnic Institute) | Hitzler, Pascal (Wright State University) | Janowicz, Krzysztof (University of California, Santa Barbara) | Swarup, Samarth (Virginia Polytechnic Institute and State University)
The Association for the Advancement of Artificial Intelligence was pleased to present the 2013 Fall Symposium Series, held Friday through Sunday, November 15–17, at the Westin Arlington Gateway in Arlington, Virginia, near Washington, DC, USA. The titles of the five symposia were as follows: Discovery Informatics: AI Takes a Science-Centered View on Big Data (FS-13-01); How Should Intelligence Be Abstracted in AI Research: MDPs, Symbolic Representations, Artificial Neural Networks, or --? (FS-13-02); Integrated Cognition (FS-13-03); Semantics for Big Data (FS-13-04); and Social Networks and Social Contagion (FS-13-05). The highlights of each symposium are presented in this report.
Rinke Hoekstra (VU University Amsterdam) presented linked open data tools to discover connections within established scientific data sets. Louiqa Raschid (University of Maryland) presented work on similarity metrics linking together drugs, genes, and diseases. Kyle Ambert (Intel) presented Finna, a text-mining system to identify passages of interest containing descriptions of neuronal ... from transferring and adapting semantic web technologies to the big data quest. Finally, in the Social Networks and Social Contagion symposium, a community of researchers explored topics such as social contagion, game theory, network modeling, network-based inference, human data elicitation, and web analytics. Highlights of the symposia are contained in this report.
Hybrid Intelligence for Semantics-Enhanced Networking Operations
Mokhtar, Bassem (Virginia Polytechnic Institute and State University) | Eltoweissy, Mohamed (Virginia Military Institute)
Endowing the semantically oblivious Internet with intelligence would advance its capability to learn traffic behavior and to predict future events. In this paper, we propose a hybrid intelligence memory system, NetMem, for network-semantics reasoning targeting Internet intelligence. NetMem provides a memory structure mimicking human memory functionality via short-term memory (StM) and long-term memory (LtM). NetMem can build a runtime-accessible dynamic network-concept ontology (DNCO) at different levels of granularity. We integrate Latent Dirichlet Allocation (LDA) and Hidden Markov Models (HMMs) to extract network semantics by learning patterns and recognizing features with syntactic and semantic dependencies. Because of the large scale and high dimensionality of Internet data, we utilize the Locality-Sensitive Hashing (LSH) algorithm for data dimensionality reduction. Simulation results using real network traffic show that NetMem with hybrid intelligence learns traffic-data semantics effectively and efficiently even with a significant reduction in the volume and dimensionality of the data, thus enhancing Internet intelligence for self-/situation-awareness and event/behavior prediction.
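The abstract names LSH as its dimensionality-reduction step but gives no details; the sketch below shows one common variant, random-hyperplane (SimHash-style) LSH, applied to toy "traffic feature" vectors. The data and parameters are invented for illustration and are not from the paper.

```python
import numpy as np

def lsh_signatures(X, n_bits=16, seed=0):
    """Reduce high-dimensional feature vectors to short binary
    signatures via random hyperplane projections (SimHash-style LSH).
    Vectors that are close in angle tend to share signature bits."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((X.shape[1], n_bits))  # random hyperplanes
    return (X @ planes >= 0).astype(np.uint8)           # one bit per plane

# Toy "traffic feature" vectors: two similar flows and one outlier.
X = np.array([[1.0, 2.0, 0.5, 0.0],
              [1.1, 1.9, 0.4, 0.1],
              [-3.0, 0.2, 5.0, -1.0]])
sig = lsh_signatures(X, n_bits=16)

# Hamming distance between signatures approximates angular distance.
d01 = int((sig[0] != sig[1]).sum())
d02 = int((sig[0] != sig[2]).sum())
print(d01, d02)  # the two similar flows should differ in fewer bits
```

Each input vector is reduced from its original dimensionality to a 16-bit signature, so similarity search over large traffic logs can be done on compact codes rather than raw features.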
Discovering Life Cycle Assessment Trees from Impact Factor Databases
Sundaravaradan, Naren (Virginia Polytechnic Institute and State University) | Patnaik, Debprakash (Virginia Polytechnic Institute and State University) | Ramakrishnan, Naren (Virginia Polytechnic Institute and State University) | Marwah, Manish (HP Labs Palo Alto, CA) | Shah, Amip (HP Labs Palo Alto, CA)
In recent years, environmental sustainability has received widespread attention due to continued depletion of natural resources and degradation of the environment. Life cycle assessment (LCA) is a methodology for quantifying multiple environmental impacts of a product across its entire life cycle, from creation to use to disposal. The key object of interest in LCA is the inventory tree, with the desired product as the root node and the materials and processes used across its life cycle as the children. The total impact of the parent in any environmental category is a linear combination of the impacts of the children in that category. LCA has generally been used in "forward" mode: given an inventory tree and impact factors of its children, the task is to compute the impact factors of the root, i.e., the product being modeled. We propose a data mining approach to solve the inverse problem, where the task is to infer inventory trees from a database of environmental factors. This is an important problem with applications not just in understanding what parts and processes constitute a product but also in designing and developing more sustainable alternatives. Our solution methodology is one of feature selection but set in the context of a non-negative least squares problem. It organizes numerous non-negative least squares fits over the impact factor database into a set of pairwise membership relations, which are then summarized into candidate trees, in turn yielding a consensus tree. We demonstrate the applicability of our approach over real LCA datasets obtained from a large computer manufacturer.
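The core fitting step described above, selecting a product's likely children by a non-negative least squares fit over the impact-factor database, can be illustrated in a few lines. This is a toy sketch of the idea only, not the authors' full tree-construction pipeline; the matrix, the product, and the weight threshold are invented.

```python
import numpy as np
from scipy.optimize import nnls

# Toy impact-factor database: rows are impact categories (e.g., CO2,
# water, energy), columns are candidate components in the database.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [1.0, 1.0, 0.0]])

# Observed impact vector of a product secretly composed of
# 2 units of component 0 and 1 unit of component 1.
b = 2.0 * A[:, 0] + 1.0 * A[:, 1]

w, residual = nnls(A, b)  # non-negative least squares fit
# Nonzero weights suggest parent-child membership in the inventory tree.
children = [j for j, wj in enumerate(w) if wj > 1e-8]
print(w, children)  # weights near [2, 1, 0]; components 0 and 1 selected
```

Repeating such fits for every product against many candidate sets yields the pairwise membership relations that the abstract says are summarized into candidate trees and, finally, a consensus tree.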
Estimating the Impact of Public and Private Strategies for Controlling an Epidemic: A Multi-Agent Approach
Barrett, Christopher L. (Virginia Polytechnic Institute and State University) | Bisset, Keith (Virginia Polytechnic Institute and State University) | Leidig, Jonathan (Virginia Polytechnic Institute and State University) | Marathe, Achla (Virginia Polytechnic Institute and State University) | Marathe, Madhav (Virginia Polytechnic Institute and State University)
This paper describes a novel approach based on a combination of techniques in AI, parallel computing, and network science to address an important problem in the social sciences and public health: planning and responding in the event of epidemics. The spread of infectious disease is an important societal problem; human behavior, social networks, and the civil infrastructure all play a crucial role in initiating and controlling such epidemic processes. We specifically consider the economic and social effects of realistic interventions proposed and adopted by public health officials and behavioral changes of private citizens in the event of a "flu-like" epidemic. Our results provide new insights for developing robust public policies that can prove useful for epidemic planning.
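The paper's simulations run on detailed synthetic populations and realistic intervention models; as a much-simplified illustration of the underlying network-based epidemic dynamics, the sketch below runs a discrete-time SIR process on a random contact network, with edge removal as a crude stand-in for a distancing-style intervention. All parameter values and the network model are invented for illustration.

```python
import random

def sir_on_network(n=300, k=6, p_inf=0.06, recover_days=5,
                   edge_removal=0.0, seed=1):
    """Discrete-time SIR epidemic on a random contact network.
    edge_removal cuts that fraction of contacts, a crude stand-in for
    interventions such as social distancing. Returns the attack rate
    (fraction of the population ever infected)."""
    rng = random.Random(seed)
    neighbors = {u: set() for u in range(n)}
    for u in range(n):                        # build random contact network
        for _ in range(k):
            v = rng.randrange(n)
            if v != u and rng.random() >= edge_removal:
                neighbors[u].add(v)
                neighbors[v].add(u)
    state = {u: "S" for u in range(n)}        # S, I, or R
    clock = {}                                # days remaining infectious
    patient_zero = rng.randrange(n)
    state[patient_zero] = "I"
    clock[patient_zero] = recover_days
    while any(s == "I" for s in state.values()):
        newly = []
        for u, s in state.items():
            if s != "I":
                continue
            for v in neighbors[u]:            # try to infect contacts
                if state[v] == "S" and rng.random() < p_inf:
                    newly.append(v)
            clock[u] -= 1
            if clock[u] == 0:
                state[u] = "R"
        for v in newly:
            if state[v] == "S":
                state[v] = "I"
                clock[v] = recover_days
    return sum(s == "R" for s in state.values()) / n

baseline = sir_on_network(edge_removal=0.0)
distancing = sir_on_network(edge_removal=0.5)
print(baseline, distancing)  # the intervention typically lowers the attack rate
```

Comparing attack rates across intervention settings is the simplest version of the kind of policy comparison the paper performs at far larger scale and fidelity.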