If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
There's no question as to who the real technology star is now: it's you. Your voice is what hundreds of companies are vying to attract, with thousands of new products calling out for you to talk to them. Voice-activated technology has erupted over the last 12 months since Amazon's Alexa was informally crowned breakout technology champion of the CES 2017 consumer tech show. Seemingly by stealth, Amazon had snuck Alexa into a dizzying array of products and everywhere you turned, there she was. Alexa was the name on everyone's lips – literally – and Amazon had achieved this near-ubiquitous name-recognition without even having a stand at the gargantuan annual gadget-fest in Las Vegas.
Artificial intelligence is already making significant inroads in taking over mundane, time-consuming tasks many humans would rather not do. The responsibilities and consequences of handing over work to AI vary greatly, though: some autonomous systems recommend music or movies; others recommend sentences in court. Even more advanced AI systems will increasingly control vehicles on crowded city streets, raising questions about safety, and about liability when the inevitable accidents occur. But philosophical arguments over AI's existential threats to humanity are often far removed from the reality of actually building and using the technology in question. Deep learning, machine vision, natural language processing: despite all that has been written and discussed about these and other aspects of artificial intelligence, AI is still at a relatively early stage in its development.
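The recommendation systems mentioned above are among the simplest of these "handed-over" tasks. As a minimal, purely illustrative sketch (the users, items, and ratings here are hypothetical), a recommender can compare users by the cosine similarity of their past ratings and borrow suggestions from the closest match:

```python
import math

# Hypothetical user -> {genre: rating} data, purely illustrative.
ratings = {
    "alice": {"drama": 5, "comedy": 3, "action": 1},
    "bob":   {"drama": 4, "comedy": 3, "action": 2},
    "carol": {"drama": 1, "comedy": 2, "action": 5},
}

def cosine(a, b):
    """Cosine similarity over the items two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[m] * b[m] for m in shared)
    na = math.sqrt(sum(a[m] ** 2 for m in shared))
    nb = math.sqrt(sum(b[m] ** 2 for m in shared))
    return dot / (na * nb)

def most_similar(user):
    """Return the other user whose tastes are closest to `user`'s."""
    others = (u for u in ratings if u != user)
    return max(others, key=lambda u: cosine(ratings[user], ratings[u]))

print(most_similar("alice"))  # bob's ratings track alice's most closely
```

Production systems are far more elaborate, but the core idea — score similarity, then rank — is the same.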
With a wave of investment, a raft of new products, and a rising tide of enterprise deployments, artificial intelligence is making a splash in the Internet of Things (IoT). Companies crafting an IoT strategy, evaluating a potential new IoT project, or seeking to get more value from an existing IoT deployment may want to explore a role for AI. Artificial intelligence is playing a growing role in IoT applications and deployments, a shift apparent in the behavior of companies operating in this area. Venture capital investments in IoT start-ups that are using AI are up sharply. Companies have acquired dozens of firms working at the intersection of AI and IoT in the last two years.
Artificial intelligence (AI) gives machines the ability to "think" and accomplish tasks. AI already is a big part of our lives in areas such as banking, shopping, security and healthcare. Soon it will help us get around in automated vehicles. By 2025, the global enterprise AI market is predicted to be worth more than $30 billion. Israeli industry can expect a nice piece of that pie due to its world-class capabilities in AI and its subsets: big-data analysis, natural-language processing, computer vision, machine learning and deep learning.
The adoption of AI in business and society is being spurred on by tech giants with the resources to design, build and roll out services affordable and simple enough for everyday use. Microsoft is one of those at the forefront. This year, the words "artificial intelligence" appeared in Microsoft's vision statement for the first time, reaffirming that smart, learning machines are considered central to everything the company does. While it may only just be beginning to shout about it, Microsoft has been building intelligent functionality into many of its products and services for some time. If you regularly use Skype, Office 365, Cortana or Bing, you have probably come across it.
According to Gartner, over 85% of customer interactions will be managed without a human by 2020. We have seen a machine master the complex game of Go, previously thought to be among the most difficult challenges for artificial intelligence. We have witnessed vehicles operating autonomously, including a convoy of trucks crossing Europe with only a single operator to monitor systems. We have seen a proliferation of robotic counterparts and automated means for accomplishing a variety of tasks, and all of this has given rise to a flurry of people claiming that the artificial intelligence revolution is already upon us. However, while there is no doubt that there have been significant advancements in the field of AI, what we have seen is only a start on the path to what could be considered full AI.
The capability to teach machines to interpret data is the key underpinning technology that will enable more complex forms of AI that can respond autonomously to input. There have been obvious failings of this technology (the unfiltered Microsoft chatbot "Tay" being a prime example), but the application of properly developed and managed artificial systems for interaction is an important step along the route to full AI. Any scientific or research project involves many repetitive tasks, and using machine intelligence to manage and perfect them would greatly increase the speed at which new breakthroughs could be uncovered. Learning from repetition, improving patterns, and developing new processes is well within reach of current AI models, and will strengthen in the coming years as advances in artificial intelligence -- specifically machine learning and neural networks -- continue.
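"Learning from repetition" can be made concrete with a toy sketch (the workflow and its step names are hypothetical): a first-order frequency model observes a repeated process and, with each pass, grows more confident about which step tends to follow which.

```python
from collections import Counter, defaultdict

class StepPredictor:
    """Learns, from repeated observation, which step usually follows which."""

    def __init__(self):
        # For each step, count how often each successor step was seen.
        self.counts = defaultdict(Counter)

    def observe(self, sequence):
        for current, nxt in zip(sequence, sequence[1:]):
            self.counts[current][nxt] += 1

    def predict(self, step):
        """Most frequently observed successor, or None if never seen."""
        if not self.counts[step]:
            return None
        return self.counts[step].most_common(1)[0][0]

model = StepPredictor()
# Ten repetitions of a (hypothetical) lab workflow...
for _ in range(10):
    model.observe(["prepare", "measure", "record", "analyze"])
# ...plus one rare variant.
model.observe(["prepare", "clean", "record"])

print(model.predict("prepare"))  # "measure" dominates after repetition
```

Real machine-learning models generalise far beyond raw counting, but the principle — repeated exposure sharpening a prediction — is the same.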
I was at a family event recently and two guests were chatting about the artificial intelligence (AI) component of driverless or autonomous vehicles and, more specifically, how these vehicles are currently unable to detect human movement at high speed. From left to right, the diagram depicts a Facebook Messenger bot, or just as easily an independent purpose-built bot placed on the homepage of a website, designed to connect with customers in a customer-service capacity and provide an AI-like response to customer enquiries. The NLP component allows the computer to interpret the vast and complicated human language, understand what's being said, process it, work out what is being asked of it, and effectively 'talk back', much as humans do. Cognitive-based systems build knowledge and learn, understand natural language, and reason and interact more naturally with human beings than traditional programmable systems.
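The "understand what's being said, then talk back" step of such a bot can be sketched in miniature. This is a hedged, assumption-laden illustration, not how any real NLP engine works: the intents, keywords, and canned replies below are all hypothetical, and the "understanding" is reduced to keyword overlap.

```python
import re

# Hypothetical customer-service intents: keywords -> canned reply.
INTENTS = {
    "order_status": ({"order", "delivery", "shipped", "track"},
                     "Let me look up your order status."),
    "returns":      ({"return", "refund", "exchange"},
                     "I can help you start a return."),
}

def reply(message):
    """Pick the intent whose keywords overlap the message most, then respond."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    best, best_overlap = None, 0
    for _intent, (keywords, response) in INTENTS.items():
        overlap = len(words & keywords)
        if overlap > best_overlap:
            best, best_overlap = response, overlap
    return best or "Sorry, could you rephrase that?"

print(reply("Where is my order? It should have shipped."))
```

Real NLP components replace the keyword sets with trained language models, but the pipeline shape — interpret, classify, respond — matches the description above.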
Over the past few years, machine learning and AI have pushed forward the capacity of computers to recognize images, understand context, and make decisions. A report from IHS Technology expects that the number of AI systems in vehicles will jump from 7 million in 2015 to 122 million by 2025, bringing new opportunities to enhance the capabilities of connected cars as more data becomes available. In addition, AI will push advanced driver assistance systems (ADAS) into the mainstream. Autonomous vehicles need AI: it is what enables the camera-based machine vision systems, radar-based detection units, driver-condition evaluation, and the sensor-fusion engine control units (ECUs) that make them work.
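The sensor-fusion idea mentioned above can be sketched in its simplest form (all numbers are illustrative, and real automotive ECUs use far richer filters, e.g. Kalman filtering): combine a camera-based and a radar-based distance estimate, weighting each by its confidence, taken here as inverse variance.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of (distance_m, variance) sensor readings."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * d for w, (d, _) in zip(weights, estimates)) / sum(weights)
    return fused

camera = (42.0, 4.0)   # hypothetical vision estimate: noisier at range
radar  = (40.0, 1.0)   # hypothetical radar estimate: tighter variance
print(round(fuse([camera, radar]), 2))  # pulled toward the more confident radar
```

The fused value lands closer to the radar reading because its variance is smaller, which is exactly why fusing complementary sensors beats trusting any one of them.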