If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
In 2017, artificial intelligence attracted $12 billion of VC investment. We are only beginning to discover the usefulness of AI applications. Amazon recently unveiled a brick-and-mortar grocery store that has successfully replaced cashiers and checkout lines with computer vision, sensors, and deep learning. Between the investment, the press coverage, and the dramatic innovation, "AI" has become a hot buzzword. But does it even exist yet?
Inundated with more data than humans can analyze, the U.S. military and intelligence community are banking on machine learning and advanced computing technologies to separate the wheat from the chaff. The Defense Department operates more than 11,000 drones that collect hundreds of thousands of hours of video footage every year. "When it comes to intelligence, surveillance and reconnaissance, or ISR, we have more platforms and sensors than at any time in Department of Defense history," said Air Force Lt. Gen. John N.T. "Jack" Shanahan, director for defense intelligence (warfighter support) in the office of the undersecretary of defense for intelligence. For example, the Pentagon has deployed a wide-area motion imagery sensor that can look at an entire city. But it takes about 20 analysts working around the clock to exploit just 6 to 12 percent of the data collected, Shanahan said.
The other night I attended a dinner with a dozen CEOs of AI startups. Once again, I heard a near-universal discomfort with the term "artificial intelligence" as they sipped Pinot Noir and fumbled to describe what they do. "We're not really trying to create intelligence that's artificial," said the CEO of a product strategy company. Another, who has built AI-based payment technologies, found the term dystopian. "Too many people think AI means the Terminator," he said.
Rumor has it that artificial intelligence is the next big industrial application to come out of big data, after the Internet of Things. Let's take a look at the market together and examine the evolution of data processing and business intelligence. The past decade has seen the rise of some great new IT technologies, including cloud computing, blockchain, and big data. Among them, big data analytics was long viewed as just a "marketing thing" that generated a lot of buzz; it is now becoming a standard across industries.
When it comes to Artificial Intelligence (AI), people's responses vary: from "Terminator and Skynet are coming to kill us all" to "Will the bots take my job?" to "Awesome, now I can sit back and do the fun stuff while the bots take care of tedious tasks for me." But there are also misperceptions and misinformation. It's always useful to have a basic grasp of AI, because whether you like it or not, AI already touches many aspects of our lives. For instance, you can now order Domino's pizza by talking to your phone. Plus, the pizza giant also says it is moving from a "mobile first" to an "AI first" philosophy.
The term Artificial Intelligence in 2017 feels a bit like cloud computing did back in 2010: it's the hot buzz phrase of the moment, and it's being broadly overapplied and misapplied in industry. I find that I can be asked about artificial intelligence, machine learning, cognitive computing, machine intelligence, and advanced analytics all in the same meeting, and folks can end up using many of these terms almost interchangeably. Sometimes I want to carry around an AI buzzword bingo card. Over the recent holiday, several of us brainstormed ways to bring more definition to all of this. I've seen a definition of machine intelligence from Gartner Group that I very much like.
Get the O'Reilly AI Newsletter and receive weekly AI news and insights from industry insiders. The following piece was first published in the AI newsletter. A recent Forrester survey of business and technology professionals found that 58% of them are researching AI, but only 12% are using AI systems. This is partly because practical AI applications are only now starting to be realized, but it's also because, right now, AI is hard: it requires very specialized skills and a develop-it-yourself attitude.
We've heard that training machine learning models is prohibitively expensive for startups and academics, largely because of the cost of renting or buying hardware. The results from one recent Google paper were estimated to cost $13k to reproduce. That's just to reproduce the final model, not to replicate the whole experimentation and hyperparameter optimisation caboodle. Equally, there are intelligence tasks (training, inference, or prediction) that would ideally happen on a cellphone or a remote sensor but are too compute-constrained locally, so they currently rely on uploading data to the cloud for processing. Machine intelligence is the future of computing, so what needs to happen at the hardware level to make it faster and more energy- and cost-efficient?