A history of branch prediction from 1500000 BC to 1995

#artificialintelligence

Hardware is quite different from software in that there are forces that push back against complexity. Every chunk of hardware you implement costs money, so you want to implement as little hardware as possible.
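
The excerpt above is about why hardware designs resist complexity; the article itself goes on to survey predictor designs through 1995. Purely as a minimal sketch (not code from the article, and assuming nothing beyond the general idea of a table of two-bit counters), the following models a two-bit saturating-counter predictor, one of the classic schemes of that era; the table size and indexing are arbitrary choices for the example.

    # Illustrative sketch: a two-bit saturating-counter branch predictor.
    # Each branch address indexes a table of counters in the range 0..3;
    # values 2..3 predict "taken", values 0..1 predict "not taken". The
    # counter moves one step toward each observed outcome, so a single
    # anomalous branch does not flip a strongly established prediction.

    class TwoBitPredictor:
        def __init__(self, table_bits=10):
            self.mask = (1 << table_bits) - 1
            self.table = [1] * (1 << table_bits)  # start weakly "not taken"

        def predict(self, pc):
            return self.table[pc & self.mask] >= 2  # True means "taken"

        def update(self, pc, taken):
            i = pc & self.mask
            if taken:
                self.table[i] = min(3, self.table[i] + 1)
            else:
                self.table[i] = max(0, self.table[i] - 1)

    # Example: a loop branch that is taken nine times and then falls through.
    p = TwoBitPredictor()
    hits = 0
    for taken in [True] * 9 + [False]:
        hits += (p.predict(0x40) == taken)
        p.update(0x40, taken)
    print(f"correct predictions: {hits}/10")  # 8/10 with this simple scheme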


A Parameterized Complexity View on Description Logic Reasoning

AAAI Conferences

Description logics are knowledge representation languages that have been designed to strike a balance between expressivity and computational tractability. Many different description logics have been developed, and numerous computational problems for these logics have been studied for their computational complexity. However, essentially all complexity analyses of reasoning problems for description logics use the one-dimensional framework of classical complexity theory. The multi-dimensional framework of parameterized complexity theory can provide a much more detailed picture of the complexity of reasoning problems. In this paper we argue that the framework of parameterized complexity has a lot to offer for the complexity analysis of description logic reasoning problems, provided one takes a progressive and forward-looking view on parameterized complexity tools. We substantiate our argument by means of three case studies. The first case study is about the problem of concept satisfiability for the logic ALC with respect to nearly acyclic TBoxes. The second case study concerns concept satisfiability for ALC concepts parameterized by the number of occurrences of union operators and the number of occurrences of full existential quantification. The third case study offers a critical look at data complexity results from a parameterized complexity point of view. These three case studies are representative of the wide range of uses for parameterized complexity methods for description logic problems.
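
As an illustration of what the parameters in the second case study count (this toy concept is ours, not taken from the paper), consider the following ALC concept, written in LaTeX notation:

    % Illustrative ALC concept; not from the paper.
    % It contains one union operator (\sqcup) and two full existential
    % quantifications (\exists), so under the second case study it would be
    % measured with parameter values (#unions, #existentials) = (1, 2).
    \[
      C \;=\; \mathrm{Student} \sqcap
              \exists \mathrm{attends}.(\mathrm{Lecture} \sqcup \mathrm{Seminar}) \sqcap
              \exists \mathrm{supervisedBy}.\mathrm{Professor}
    \]

The concept is satisfiable (any interpretation with a student who attends a lecture and is supervised by a professor satisfies it), and the parameterized question is how the cost of deciding such satisfiability scales when these two counts are small.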


Application of Metric Measures: From Conventional Software to Expert Systems

AAAI Conferences

The importance of metric measures has been recognized in almost every scientific and engineering discipline, including software engineering, where much progress has been made in applying them. In this paper, metric measures are studied for their application to expert systems, an area in which little work has been done. The characteristics and organization of metric measures are discussed and presented. Because of the analogy between conventional software and expert systems, a comparative study is also conducted. A new expert system metric is proposed. Test results indicate that this new measure compares favorably with others.
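
The abstract does not spell out the proposed metric. Purely as a hypothetical illustration of what a rule-base measure analogous to conventional software size/complexity metrics might look like, the sketch below counts rules and antecedent conditions; the rule representation is an assumption made for the example, and this is not the paper's metric.

    # Hypothetical illustration only; not the metric proposed in the paper.
    # A rule is modeled as (list_of_condition_strings, conclusion_string),
    # and the measure counts rules and average antecedents per rule, by
    # analogy with size and decision counts in conventional software metrics.

    def rule_base_metrics(rules):
        n_rules = len(rules)
        n_conditions = sum(len(conds) for conds, _ in rules)
        return {
            "rules": n_rules,
            "conditions": n_conditions,
            "avg_conditions_per_rule": n_conditions / n_rules if n_rules else 0.0,
        }

    example_rules = [
        (["temperature > 39", "has_rash"], "suspect_measles"),
        (["temperature > 39"], "suspect_fever"),
        (["has_cough", "short_of_breath", "has_fever"], "suspect_pneumonia"),
    ]
    print(rule_base_metrics(example_rules))
    # {'rules': 3, 'conditions': 6, 'avg_conditions_per_rule': 2.0}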


The March Into the Black Hole of Complexity

Communications of the ACM

In June 2002, Communications published my Viewpoint "Rebirth of the Computer Industry," in which I expressed hope that the past complexity sins of the computer industry had been admitted and that something positive could happen [5]. Instead, complexity has increased at an accelerated and alarming rate. I pointed to many fundamental problems in my previous Viewpoint; here, I emphasize two aspects: the hardware-software mismatch and the state of affairs in developing and sustaining software systems. Both are root causes that introduce enormous risks and undermine digital-age safety, security, and integrity. If the world ever hopes to climb out of the black hole of software complexity, it will have to address these aspects in a constructive manner.


VMwareVoice: Transforming Security: The Principle Of Least Privilege

Forbes - Tech

Given the size and complexity of modern IT infrastructures, Bass and Corn both agree that an old principle of cyber hygiene, least privilege, is the foundation for a new approach to understanding security risks and how to mitigate them. Least privilege is the concept that an application or service (or, on the end-user computing side, a user or device) should have access only to the information and resources that are necessary for its legitimate purpose. It is a principle that promises to unify the approach to improving both end-user and data center security. It focuses the organization on the real risk (the applications and data) and on containing and shrinking that risk. It improves the signal-to-noise ratio, and it helps reduce the complexity that causes the misalignments and misconfigurations at the heart of so many breaches.
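
As a concrete, deliberately simplified sketch of deny-by-default least privilege (the principal names and permissions below are invented for the example, not drawn from VMware's products), each principal is granted an explicit allow-list of actions on resources, and everything else is refused:

    # Simplified illustration of least privilege: each principal gets an
    # explicit allow-list of (action, resource) pairs; all other requests
    # are denied by default.

    GRANTS = {
        # The reporting service only ever needs to read the sales database.
        "reporting-service": {("read", "db:sales")},
        # An end-user device only needs to read its owner's mailbox.
        "alice-laptop": {("read", "mail:alice")},
    }

    def is_allowed(principal, action, resource):
        # Deny by default: anything not explicitly granted is refused.
        return (action, resource) in GRANTS.get(principal, set())

    print(is_allowed("reporting-service", "read", "db:sales"))    # True
    print(is_allowed("reporting-service", "write", "db:sales"))   # False
    print(is_allowed("reporting-service", "read", "db:payroll"))  # False

Scoping grants this narrowly is what shrinks the risk the excerpt describes: a compromised reporting service could still read sales data, but it could not write anywhere or touch payroll.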