In-Network PCA and Anomaly Detection

Neural Information Processing Systems

We consider the problem of network anomaly detection in large distributed systems. In this setting, Principal Component Analysis (PCA) has been proposed as a method for discovering anomalies by continuously tracking the projection of the data onto a residual subspace. This method was shown to work well empirically in highly aggregated networks, that is, those with a limited number of large nodes and at coarse time scales.
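
As a rough illustration of the residual-subspace idea (a minimal sketch in Python, not the paper's implementation; the synthetic data, subspace dimension, and threshold below are assumptions), the snippet fits PCA on normal traffic measurements, projects new points onto the residual subspace, and flags those whose residual energy is unusually large:

    import numpy as np

    def fit_residual_projector(X, k):
        """Return the projector onto the residual subspace (I - V_k V_k^T)."""
        Xc = X - X.mean(axis=0)
        # Principal directions are the right singular vectors of the centered data.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        Vk = Vt[:k].T                       # top-k principal directions
        return np.eye(X.shape[1]) - Vk @ Vk.T

    def residual_energy(R, mean, x):
        """Squared norm of x's projection onto the residual subspace."""
        return float(np.sum((R @ (x - mean)) ** 2))

    # Toy usage on synthetic "normal traffic" (assumed for illustration).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))
    R, mean = fit_residual_projector(X, k=3), X.mean(axis=0)
    threshold = np.quantile([residual_energy(R, mean, row) for row in X], 0.999)
    x_new = X[0] + 5.0                      # a shifted, anomalous measurement
    print(residual_energy(R, mean, x_new) > threshold)   # expected: True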


AAAI-07 Workshop Reports

AI Magazine

The AAAI-07 workshop program was held Sunday and Monday, July 22-23, in Vancouver, British Columbia, Canada. The program included the following thirteen workshops: (1) Acquiring Planning Knowledge via Demonstration; (2) Configuration; (3) Evaluating Architectures for Intelligence; (4) Evaluation Methods for Machine Learning; (5) Explanation-Aware Computing; (6) Human Implications of Human-Robot Interaction; (7) Intelligent Techniques for Web Personalization; (8) Plan, Activity, and Intent Recognition; (9) Preference Handling for Artificial Intelligence; (10) Semantic e-Science; (11) Spatial and Temporal Reasoning; (12) Trading Agent Design and Analysis; and (13) Information Integration on the Web.



The AAAI-07 Conference: Focal Point for AI Research Worldwide

AI Magazine

Horvitz noted two emerging trends at the conference and in the AI field. The second is the work on scaling AI to be more integrative. Instead of the ongoing great successes of AI research on "wedges" of AI expertise and reasoning, there is increasing work on delivering more depth and breadth of capabilities such as sensing, learning, and reasoning. "This is very hard," notes Horvitz, "(but already) I see bits and pieces here and there." With events such as the Game Playing Competition, the Poker Competition, and the Human Versus Machine competition giving the 1,025 attendees much to choose from, AAAI's Twenty-second Conference (AAAI-07) continued a longstanding tradition of excellence.


Knowware: the third star after Hardware and Software

arXiv.org Artificial Intelligence

This book proposes to separate knowledge from software and to make it a commodity called knowware. The architecture, representation, and function of knowware are discussed. The principles of knowware engineering and its three life cycle models, the furnace model, the crystallization model, and the spiral model, are proposed and analyzed. Techniques of software/knowware co-engineering are introduced. A software component whose knowledge is replaced by knowware is called mixware. An object- and component-oriented development schema for mixware is introduced. In particular, the tower model and ladder model for mixware development are proposed and discussed. Finally, knowledge service and knowware-based Web service are introduced and compared with conventional Web service. In summary, knowware, software, and hardware should be considered three equally important underpinnings of the IT industry. Ruqian Lu is a professor of computer science at the Institute of Mathematics, Academy of Mathematics and System Sciences, and a fellow of the Chinese Academy of Sciences. His research interests include artificial intelligence, knowledge engineering, and knowledge-based software engineering. He has published more than 100 papers and 10 books. He has won two first-class awards from Academia Sinica and a national second-class prize from the Ministry of Science and Technology, as well as the sixth Hua Loo-keng Mathematics Prize.


Analyzing covert social network foundation behind terrorism disaster

arXiv.org Artificial Intelligence

This paper addresses a method for analyzing the covert social network foundation hidden behind a terrorism disaster. It solves a node discovery problem: discovering a node that plays a relevant role in a social network but has escaped monitoring of the presence and mutual relationships of nodes. The method integrates the expert investigator's prior understanding, insight into the nature of terrorists' social networks derived from complex graph theory, and computational data processing. The social network responsible for the 9/11 attacks in 2001 is used in a simulation experiment to evaluate the performance of the method.


Using RDF to Model the Structure and Process of Systems

arXiv.org Artificial Intelligence

Many systems can be described in terms of networks of discrete elements and their various relationships to one another. A semantic network, or multi-relational network, is a directed labeled graph consisting of a heterogeneous set of entities connected by a heterogeneous set of relationships. Semantic networks serve as a promising general-purpose modeling substrate for complex systems. Various standardized formats and tools are now available to support practical, large-scale semantic network models. First, the Resource Description Framework (RDF) offers a standardized semantic network data model that can be further formalized by ontology modeling languages such as RDF Schema (RDFS) and the Web Ontology Language (OWL). Second, the recent introduction of highly performant triple-stores (i.e., semantic network databases) allows semantic network models on the order of $10^9$ edges to be efficiently stored and manipulated. RDF and its related technologies are currently used extensively in the domains of computer science, digital library science, and the biological sciences. This article provides an introduction to RDF/RDFS/OWL and examines their suitability for modeling discrete-element complex systems.
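
As a small illustration (a hedged sketch; the http://example.org/system# vocabulary and the triples are invented for this example), the snippet below uses the Python rdflib library to build and traverse a tiny RDF semantic network with RDFS typing:

    from rdflib import Graph, Literal, Namespace, RDF, RDFS

    EX = Namespace("http://example.org/system#")   # hypothetical vocabulary
    g = Graph()
    g.bind("ex", EX)

    # Schema: one entity class and one relationship type.
    g.add((EX.Component, RDF.type, RDFS.Class))
    g.add((EX.dependsOn, RDF.type, RDF.Property))

    # Instance data: a small directed labeled graph.
    g.add((EX.webServer, RDF.type, EX.Component))
    g.add((EX.database, RDF.type, EX.Component))
    g.add((EX.webServer, EX.dependsOn, EX.database))
    g.add((EX.webServer, RDFS.label, Literal("front-end web server")))

    # Traverse one heterogeneous relationship type.
    for s, o in g.subject_objects(EX.dependsOn):
        print(f"{s} depends on {o}")

    print(g.serialize(format="turtle"))    # standard interchange format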


Expressive Commerce and Its Application to Sourcing: How We Conducted $35 Billion of Generalized Combinatorial Auctions

AI Magazine

Expressive commerce combines the advantages of highly expressive human negotiation with the advantages of electronic reverse auctions. The idea is that supply and demand are expressed in drastically greater detail than in traditional electronic auctions and are algorithmically cleared. We have hosted $35 billion of sourcing using the technology and created $4.4 billion of hard-dollar savings, plus numerous harder-to-quantify benefits. The suppliers also benefited by being able to express production efficiencies and creativity, and through the removal of exposure problems.
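
To make "algorithmically cleared" concrete (a toy sketch, not the production clearing engine described in the article; the suppliers, items, and brute-force search are invented for illustration), the snippet below clears a tiny combinatorial reverse auction by selecting the cheapest set of disjoint package bids that covers all items:

    from itertools import combinations

    def clear_reverse_auction(items, bids):
        """Choose disjoint package bids covering all items at minimum cost.

        bids: list of (supplier, frozenset_of_items, price) tuples.
        Brute force is exponential; real winner determination uses
        integer programming, but the objective is the same.
        """
        items = frozenset(items)
        best_cost, best_sel = float("inf"), None
        for r in range(1, len(bids) + 1):
            for sel in combinations(bids, r):
                covered = [it for _, pkg, _ in sel for it in pkg]
                if len(covered) == len(set(covered)) and set(covered) == items:
                    cost = sum(price for _, _, price in sel)
                    if cost < best_cost:
                        best_cost, best_sel = cost, sel
        return best_cost, best_sel

    # Hypothetical package bids; s1 offers a bundle discount.
    bids = [
        ("s1", frozenset({"steel", "freight"}), 90),
        ("s2", frozenset({"steel"}), 60),
        ("s2", frozenset({"freight"}), 50),
        ("s3", frozenset({"steel", "freight"}), 105),
    ]
    cost, winners = clear_reverse_auction({"steel", "freight"}, bids)
    print(cost, [supplier for supplier, _, _ in winners])   # 90 ['s1']

Here the bundle bid wins because its package price beats the best combination of single-item bids, which is exactly the kind of production efficiency an expressive bid lets a supplier convey.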


Constraint-Based Random Stimuli Generation for Hardware Verification

AI Magazine

We report on random stimuli generation for hardware verification at IBM as a major application of various artificial intelligence technologies, including knowledge representation, expert systems, and constraint satisfaction. For more than a decade we have developed several related tools, with huge payoffs. Research and development around this application are still thriving, as we continue to cope with the ever-increasing complexity of modern hardware systems and demanding business environments.
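
As a rough sketch of constraint-based random stimuli generation (a generic rejection-sampling scheme over an invented instruction format, not IBM's generator; production tools use full CSP solving to handle tightly constrained spaces), the snippet below draws random instruction operands that satisfy a few validity constraints:

    import random

    def generate_stimulus(variables, constraints, rng, max_tries=10_000):
        """Sample one random assignment that satisfies every constraint.

        variables: dict mapping a name to its list of legal values.
        constraints: list of predicates over a complete assignment.
        """
        for _ in range(max_tries):
            assignment = {v: rng.choice(dom) for v, dom in variables.items()}
            if all(c(assignment) for c in constraints):
                return assignment
        raise RuntimeError("no satisfying stimulus found")

    # A hypothetical memory-access instruction template.
    variables = {
        "opcode": ["LOAD", "STORE"],
        "src": list(range(16)),
        "dst": list(range(16)),
        "addr": list(range(0, 2**16, 4)),    # word-aligned addresses only
    }
    constraints = [
        lambda a: a["src"] != a["dst"],                         # distinct registers
        lambda a: a["opcode"] != "STORE" or a["addr"] < 2**12,  # stores stay in low memory
    ]
    print(generate_stimulus(variables, constraints, random.Random(42)))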