If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Life on Earth has been shaped by billions of years of evolution. To be specific, it's estimated that life on Earth has existed for about 3.8 billion years, while the Earth itself is put at 4.543 billion years old. Thus life has, in some form or another, occupied Earth for approximately 83% of its history. What began with single-celled prokaryotes such as bacteria has grown into a planet teeming with more than 8.7 million different species.
Some call it "strong" AI, others "real" AI, "true" AI or artificial "general" intelligence (AGI)… whatever the term (and the important nuances), there are few questions of greater importance than whether we are collectively in the process of developing generalized AI that can truly think like a human -- possibly even at a superhuman level of intelligence, with unpredictable, uncontrollable consequences. This has been a recurring theme of science fiction for many decades, but given the dramatic progress of AI over the last few years, the debate has been flaring anew with particular intensity, with an increasingly vocal stream of media coverage and conversations warning us that AGI (of the nefarious kind) is coming, and much sooner than we'd think. The latest example: the new documentary Do You Trust This Computer?, which streamed for free last weekend courtesy of Elon Musk and features a number of respected AI experts from both academia and industry. The documentary paints an alarming picture of artificial intelligence as a "new life form" on planet Earth that is about to "wrap its tentacles" around us. There is also an accelerating flow of stories pointing to ever scarier aspects of AI: reports of alternate-reality creation (a fake celebrity face generator and deepfakes, with full video generation and speech synthesis likely in the near future), the ever-so-spooky Boston Dynamics videos (the latest: robots cooperating to open a door), and reports of Google's AI getting "highly aggressive." However, as an investor who spends a lot of time in the "trenches" of AI, I have been experiencing a fair amount of cognitive dissonance on this topic.
Other major firms are following suit. Microsoft has announced dedicated silicon hardware to accelerate deep learning in its Azure cloud. And in July, the firm also revealed that its augmented-reality headset, the HoloLens, will have a customized chip in it to optimize machine-learning applications. Apple has a long track record of designing its own silicon for specialist requirements. Earlier this year, Apple ended its relationship with Imagination Technologies, a firm that had been providing GPU designs for iPhones, in favor of its own GPU designs.
In 2017, artificial intelligence attracted $12 billion of VC investment. We are only beginning to discover the usefulness of AI applications. Amazon recently unveiled a brick-and-mortar grocery store that has successfully supplanted cashiers and checkout lines with computer vision, sensors, and deep learning. Between the investment, the press coverage, and the dramatic innovation, "AI" has become a hot buzzword. But does it even exist yet?
Inundated with more data than humans can analyze, the U.S. military and intelligence community are banking on machine learning and advanced computing technologies to separate the wheat from the chaff. The Defense Department operates more than 11,000 drones that collect hundreds of thousands of hours of video footage every year. "When it comes to intelligence, surveillance and reconnaissance, or ISR, we have more platforms and sensors than at any time in Department of Defense history," said Air Force Lt. Gen. John N.T. "Jack" Shanahan, director for defense intelligence (warfighter support) in the office of the undersecretary of defense for intelligence. For example, the Pentagon has deployed a wide-area motion imagery sensor that can look at an entire city. But it takes about 20 analysts working around the clock to exploit just 6 to 12 percent of the data collected, Shanahan said.
The other night I attended a dinner with a dozen CEOs of AI startups. Once again, I heard a near-universal discomfort with the term "artificial intelligence" as they sipped Pinot Noir and fumbled to describe what they do. "We're not really trying to create intelligence that's artificial," said the CEO of a product strategy company. Another, who has built AI-based payment technologies, found the term dystopian. "Too many people think AI means the Terminator," he said.
Rumor has it that artificial intelligence is the next big industrial application coming out of big data, after the Internet of Things. Let's take a look at the market together and examine the evolution of data processing and business intelligence. The past decade has seen the rise of some great new IT technologies, including cloud computing, blockchain, and big data. Among them, big data analytics was long viewed as just a "marketing thing" that generated a lot of buzz; it is now becoming a standard across industries.
When it comes to Artificial Intelligence (AI), people's responses vary: from "Terminator and Skynet are coming to kill us all" to "Will the bots take my job?" to "Awesome, now I can sit back and do the fun stuff while the bots take care of tedious tasks for me." But there are also misperceptions and misinformation. It's always useful to have a basic grasp of AI, because, whether you like it or not, AI is already present in many aspects of our lives. For instance, you can now order Domino's pizza by talking to your phone, and the pizza giant says it is moving from a "mobile first" to an "AI first" philosophy.