If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
If science-fiction movies have taught us anything, it's that the future is a bleak and terrifying dystopia ruled by murderous sentient robots. Fortunately, only one of these things is true – but that could soon change, as the doomsayers are so fond of telling us. Artificial intelligence and machine learning are among the most significant technological developments in recent history. Few fields promise to "disrupt" (to borrow a favored term) life as we know it quite like machine learning, but many of the applications of machine learning technology go unseen. Want to see some real examples of machine learning in action?
IBM is ready to open the doors to the first customers for its commercial quantum-computing services. Sometimes the most profound solution is to change the entire problem. JPMorgan Chase, Daimler, and Samsung will be among the first group of businesses to gain access to IBM's new 20-quantum-bit (qubit) IBM Q quantum system to help uncover commercial, industrial and scientific quantum computing applications. While IBM is playing catchup with artificial-intelligence breakthroughs made by Google, Microsoft, and Amazon, it has taken a lead in the race to build a practical quantum computer thanks to its recently demonstrated prototype 50-qubit processor. The companies joining the IBM Q Network gain access to the 20-qubit system, which is capable of producing qubits with a record 90-microsecond 'coherence' time – the time a single qubit, representing both 1 and 0 simultaneously, survives in this state before collapsing into a conventional bit's single state of either 1 or 0. At the scale of the prototype, 50 qubits can represent more than one thousand trillion values.
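The "one thousand trillion" figure follows directly from how qubits scale: n qubits in superposition span 2^n basis states. A quick sketch (illustrative arithmetic only, not IBM code) confirms the claim:

```python
# Each qubit can be in a superposition of 0 and 1, so a register of
# n qubits spans 2**n basis states simultaneously.
def qubit_state_count(n_qubits: int) -> int:
    return 2 ** n_qubits

print(qubit_state_count(20))  # the commercial IBM Q system's 20 qubits
print(qubit_state_count(50))  # the 50-qubit prototype processor
# 2**50 = 1,125,899,906,842,624 -- just over one thousand trillion
assert qubit_state_count(50) > 10 ** 15
```

Note that this counts representable basis states, not usable classical storage: reading the register out still yields only a single 50-bit value.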
Just as blockchain technology is being aligned with the Internet of Things (IoT), it is also increasingly being mentioned by those involved in advancing artificial intelligence (AI). Indeed, some - including legacy institutions like IBM and SAP - see a future involving the convergence of all of these technologies. Unlike blockchain technology and IoT, AI - which, in one sense, is about creating computer applications that act as smart as humans - is not a new concept. Research began in academia in the 1950s, and the subject was popularized in the 1968 science fiction movie "2001: A Space Odyssey," featuring the humanlike HAL 9000 computer. Usable computing systems running AI programs emerged in the 1980s, in the form of expert systems that were able to apply pre-programmed knowledge and make rules-based business decisions.
We have had many previous hype cycles around AI. As I wrote in Silicon Collar: "Since the 1950s! That is when Alan Turing defined his famous test to measure a machine's ability to exhibit intelligent behavior equivalent to that of a human. In 1959, we got excited when Allen Newell and his colleagues coded the General Problem Solver. In 1968, Stanley Kubrick sent our minds into overdrive with HAL in his movie, 2001: A Space Odyssey."
Open any business publication or digital journal today, and you will read about the promise of AI – whether you expand it as artificial or augmented intelligence – and how it will transform your business. The fact is, AI will not only transform your entire business, whether you are in health care, finance, retail or manufacturing, but it will also transform technology itself. The essential task of information technology (IT), and how we measure its value, has reached an inflection point. It's no longer just about process automation and codifying business logic. Instead, insight is the new currency, and the speed with which we can scale that insight and the knowledge it brings is the basis for value creation and the key to competitive advantage.
IBM researchers have developed a program that can predict the products of organic chemistry reactions.1 Modelled on the latest language translation systems – like Google's artificial neural network – the AI picked the right product 80% of the time despite not having been taught any organic chemistry rules. 'What this tool is trying to do is imitate a top pro chemist in more or less the entire domain of organic chemistry,' says Teodoro Laino, one of the researchers involved in the study at IBM in Zurich, Switzerland. His ambitious goal is shared by other chemists who have been attempting to create a functioning AI chemist since the 1970s, when organic chemist E J Corey kick-started the field by creating a chemical knowledge database. However, making a tool based on chemistry knowledge can be time-consuming; Bartosz Grzybowski's team took 10 years to encode their Chematica retrosynthesis program with 20,000 chemical rules. Moreover, a knowledge-based AI has difficulty tackling reactions that lie outside of its rule set.
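The "translation" framing above is the key idea: instead of encoding chemistry rules, the model treats reactant SMILES strings as the source sentence and product SMILES as the target, and learns the mapping from example pairs. A toy sketch of that data framing (illustrative only – the tokenizer pattern and example reaction are assumptions, not IBM's actual pipeline):

```python
import re

# Reaction prediction as "translation": reactant SMILES are the source
# sequence, product SMILES the target. A sequence model is trained on
# such pairs instead of hand-coded chemistry rules.

# Minimal SMILES tokenizer: two-letter halogens first, then single
# characters (atoms, digits, bonds, branches, the '.' separator).
SMILES_TOKEN = re.compile(r"Cl|Br|[A-Za-z]|\d|[()=#+\-\[\].>]")

def tokenize(smiles):
    """Split a SMILES string into model-ready tokens."""
    return SMILES_TOKEN.findall(smiles)

# One hypothetical training pair: esterification of acetic acid
# with methanol, yielding methyl acetate.
source = tokenize("CC(=O)O.CO")   # reactants
target = tokenize("CC(=O)OC")     # product
print(source)  # ['C', 'C', '(', '=', 'O', ')', 'O', '.', 'C', 'O']
print(target)  # ['C', 'C', '(', '=', 'O', ')', 'O', 'C']
```

In practice such pairs feed an encoder-decoder neural network, exactly as sentence pairs feed a machine-translation system; the 80% top-1 accuracy reported above is what that learned mapping achieved.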
When living and operating in a market largely dominated by a vendor that isn't you, the strategy you must deploy is one of focus. In the early days of Power, IBM tried to take on Intel head to head, and that just wasn't working. You can understand why IBM thought it could do this; it was once the most powerful company in the world. But, like Microsoft, Intel's strength largely came from providing technology to firms like IBM, and IBM's decline in the late 1980s and early 1990s not only weakened it substantially, it strengthened those very firms. Much like AMD, which has always been weaker than Intel, IBM needed to pick its battles, and given that the company still pretty much owns the market for enterprise-class AI with Watson, and that this segment is slated to become the most lucrative in the industry for servers over the next decade, it chose wisely to make this one of its critical areas of focus.
One of the toughest aspects of having epilepsy is not knowing when the next seizure will strike. A wearable warning system that detects pre-seizure brain activity and alerts people of its onset could alleviate some of that stress and make the disorder more manageable. To that end, IBM researchers say they have developed a portable chip that can do the job; they described their invention today in the Lancet's open access journal eBioMedicine. The scientists built the system on a mountain of brainwave data collected from epilepsy patients. The dataset, reported by a separate group in 2013, included over 16 years of continuous electroencephalography (EEG) recordings of brain activity, and thousands of seizures, from patients who had had electrodes surgically implanted in their brains.
Jack Danahy is the co-founder and CTO of Barkly, an endpoint protection platform that is transforming the way businesses protect endpoints. A 25-year innovator in computer, network and data security, Jack was previously the founder and CEO of two successful security companies: Qiave Technologies (acquired by Watchguard Technologies in 2000) and Ounce Labs (acquired by IBM in 2009). Jack is a frequent writer and speaker on security and security issues, and has received multiple patents in a variety of security technologies. Prior to founding Barkly, he was the Director of Advanced Security for IBM, and led the delivery of security services for IBM in North America.
Companies running AI applications often need as much computing muscle as researchers who use supercomputers do. IBM's latest system is aimed at both audiences. The company last week introduced its first server powered by the new Power9 processor designed for AI and high-performance computing. The powerful technologies inside have already attracted the likes of Google and the US Department of Energy as customers. The new IBM Power System AC922 is equipped with two Power9 CPUs and from two to six NVIDIA Tesla V100 GPUs.