If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
The events of the past few weeks have provided much fodder for this columnist. First, there was the American grandstanding on immigration, and then the events at Cognizant Technology Solutions Corp., Infosys Ltd and Tata Consultancy Services Ltd. More recently, Cloudflare Inc., which hosts information for close to two million websites, including Uber Technologies Inc. and OKCupid, had an Internet security disaster that saw the leak of passwords, cookies, and private messages from adult dating sites. This "Cloudbleed" was discovered on 17 February 2017, but had evidently been around for many months. Decisions, decisions--what does one chew on first?
That patent, awarded April 25, 1961, recognizes Robert Noyce as the inventor of the silicon integrated circuit (IC). Integrated circuits forever changed how computers were made while adding power to a process of another kind: the growth of a then-nascent field called artificial intelligence (AI). And the potential of Noyce's invention truly took flight when he and Gordon Moore founded Intel on July 18, 1968. Fifty years later, the "eternal spring" of artificial intelligence is in full swing. To understand how we arrived here, here's the truth in a nutshell: the rise of artificial intelligence is intertwined with the history of faster, more robust microprocessors.
Disruptive technologies such as artificial intelligence (AI) and encryption hold the promise of solving some of the world's most pressing issues. Future innovations relying on their use are endless: creating remote health care for the elderly and people with disabilities, for instance, or protecting our privacy and creating smart cities that can reduce waste and ease congestion. But some people look at these innovations with a deep sense of fear, envisioning a future where robots take over our jobs and eventually eclipse us. It's an understandable fear – and one that's long been popularized by the movies and the media. It's even become a polarizing battle within the tech industry itself, with Elon Musk warning about the possible misuse and militarization of AI, while tech execs, including Google's Eric Schmidt and Facebook's Mark Zuckerberg, call Musk's views misleading and alarmist.
Quantum computing promises enough computational power to solve problems far beyond the capabilities of the fastest digital computers, so the Defense Advanced Research Projects Agency is laying the groundwork for applying the technology to real-world problems. In a request for information, DARPA is asking how quantum computing can enable new capabilities when it comes to solving science and technology problems, such as understanding complex physical systems, optimizing artificial intelligence and machine learning, and enhancing distributed sensing. Noting that it is not interested in solving cryptology issues, DARPA is asking the research community to help solve challenges of scale, environmental interactions, connectivity and memory, and to suggest "hard" science and technology problems the technology could be leveraged to solve. Among the goals DARPA lists: establishing the fundamental limits of quantum computing in terms of how problems should be framed, when a model's scale requires a quantum-based solution, how to manage connectivity and errors, the size of potential speed gains and the ability to break large problems into smaller pieces that can map to several quantum platforms; and improving machine learning by leveraging a hybrid quantum/classical computing approach to decrease the time required to train machine learning models.
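The hybrid quantum/classical approach mentioned above typically works as a loop: a quantum circuit is run and measured, and a classical optimizer adjusts the circuit's parameters based on the result. Here is a minimal plain-Python sketch of that loop under illustrative assumptions (a single simulated qubit whose expectation value is computed analytically; a real device would estimate it from repeated measurements, and the cost function here is invented for the example, not DARPA's):

```python
import math

def expectation_z(theta):
    """Expected Z measurement of one qubit after an Ry(theta) rotation.
    Analytically <Z> = cos(theta); on real hardware this would be
    estimated by running the circuit many times and averaging outcomes."""
    return math.cos(theta)

# Classical outer loop: gradient descent on the circuit parameter theta,
# treating the measured expectation as the cost to minimize.
theta, lr = 0.1, 0.2
for step in range(100):
    grad = -math.sin(theta)  # exact gradient of cos(theta)
    theta -= lr * grad

print(round(theta, 4), round(expectation_z(theta), 4))
# theta converges to pi, where the cost <Z> reaches its minimum of -1
```

The division of labor is the point: the quantum processor only evaluates the circuit, while the familiar classical optimizer does the training, which is the structure behind variational approaches to quantum machine learning.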
Artificial intelligence needs a strong and reliable computational back-end to perform. Most of today's IT systems provide just enough processing power to make it work in its most basic state: AI is limited to specialized machine learning algorithms capable of performing specific tasks in an automated way. With recent quantum developments, however, qubits are ready to take over from the old and modest bits, and along the way to bring AI to a new level. Quantum computers upgrade the model: where a classical bit is always either 0 or 1, a qubit can exist in a superposition of both states at once. That amplified processing power makes computation on suitable tasks faster and more effective.
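To make the bit/qubit contrast concrete, here is a minimal sketch in plain Python (representing a qubit by its two complex amplitudes, an assumption of this illustration rather than anything from the text): a Hadamard gate puts one simulated qubit into an equal superposition, and measurement collapses it to 0 or 1 at random.

```python
import random

# A classical bit holds exactly one value at a time.
bit = 0

# A qubit is described by two complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1. Start in the definite state |0>.
qubit = (1 + 0j, 0 + 0j)

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to an equal superposition."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse the qubit: returns 0 with probability |a|^2, else 1."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = hadamard(qubit)
probs = (abs(qubit[0]) ** 2, abs(qubit[1]) ** 2)
print(probs)  # both outcomes now equally likely, ~0.5 each
samples = [measure(qubit) for _ in range(10_000)]
print(sum(samples) / len(samples))  # averages to roughly 0.5
```

The quantum advantage does not come from this one qubit, but from the fact that n qubits carry 2^n amplitudes at once, which certain algorithms can exploit.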
Since Alan Turing first posed the question "can machines think?" in his seminal 1950 paper, "Computing Machinery and Intelligence", Artificial Intelligence (AI) has failed to deliver on its promise. That is, Artificial General Intelligence. There have, however, been incredible advances in the field, including Deep Blue beating the world's best chess player, the birth of autonomous vehicles, and DeepMind's AlphaGo beating the world's best Go player. The current achievements represent the culmination of research and development that occurred over more than 65 years. Importantly, during this period there were two well-documented AI Winters that almost completely discredited the promise of AI.
"The most important benefit of quantum computers is the speed at which they can solve complex problems," says Bansal. While they're lightning quick at what they do, Bansal notes, "they don't provide capabilities to solve problems from undecidable or NP Hard problem classes." There is a problem set that quantum computing will be able to solve, but it is not applicable to all computing problems. Typically, the problem set that quantum computers are good at solving involves number or data crunching with a huge amount of inputs, such as "complex optimisation problems and communication systems analysis problems" – calculations that would typically take supercomputers days, years, even billions of years to brute force. The application that's regularly trotted out as an example of what quantum computers will be able to break almost instantly is strong RSA encryption.
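To see why such problems overwhelm classical brute force, consider factoring, the problem underlying RSA. A naive Python sketch (trial division on a toy modulus; real RSA moduli are hundreds of digits long, chosen here only for illustration):

```python
def trial_division(n):
    """Brute-force factoring: try every candidate divisor up to sqrt(n).
    The work grows exponentially in the number of digits of n, which is
    why this approach is hopeless against real RSA key sizes."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

# A toy 'RSA modulus': the product of two primes.
p, q = 1_000_003, 1_000_033
n = p * q
print(trial_division(n))  # (1000003, 1000033), after about a million divisions
```

A quantum computer running Shor's algorithm could instead factor in time polynomial in the number of digits, which is exactly why RSA is the example regularly trotted out.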
What are some underdeveloped areas in computer science research right now (2018)? Over the past few decades, computer science research, whether in industry or academia, has led to groundbreaking technology innovations such as the internet, which continues to change our lives. In the post-Moore's Law era, advances in cloud computing have affected many sub-areas of computer science, such as operating systems and database systems. Furthermore, solid state drives (SSDs) changed the way we design storage systems, which were previously tailored to mechanical hard drives (HDDs). More recently, quantum computing has promised lightning-speed calculations compared with classical electronics-based computers.
There are many simulation and optimization problems that are difficult or impossible to solve using your existing computing resources. You do not have a quantum computer, which may be able to solve them, and you do not expect your company to get one soon. You are not alone, but don't worry: IBM will let you use its quantum computing resources so you can make a start on formulating solutions. For years, quantum computing was little more than an idea that fascinated computer scientists. Now it is offering direct utility for researchers and engineers even before the promise of a large-scale universal quantum computer is fulfilled.
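Formulating a problem for such a service means expressing it as a quantum circuit. As a dependency-free illustration of the kind of circuit you would submit (in practice through IBM's Qiskit SDK; the plain-Python statevector below is just a stand-in for this sketch), here is the classic two-qubit Bell-state circuit, the "hello world" of quantum programming:

```python
# Two-qubit state as four complex amplitudes for |00>, |01>, |10>, |11>.
state = [1 + 0j, 0j, 0j, 0j]  # start in |00>

def h_on_qubit0(s):
    """Hadamard on the first qubit: puts it into an equal superposition."""
    r = 2 ** -0.5
    return [r * (s[0] + s[2]), r * (s[1] + s[3]),
            r * (s[0] - s[2]), r * (s[1] - s[3])]

def cnot(s):
    """CNOT with qubit 0 as control: flips qubit 1 when qubit 0 is 1,
    i.e. swaps the amplitudes of |10> and |11>."""
    return [s[0], s[1], s[3], s[2]]

state = cnot(h_on_qubit0(state))
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]: the entangled Bell state
```

Measuring this state yields 00 or 11 with equal probability and never 01 or 10: the two qubits are entangled, the resource that cloud quantum services let you experiment with today.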