If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Artificial Intelligence (AI) has stood out among emerging technologies. Sensing great possibilities, global chip giant Intel has now joined the AI bandwagon in a big way. AI is not new to the world of technology, but the past five years have given AI believers reason to cheer as its uses increase across industries – from health care to autonomous vehicles – say AI experts at Intel. "AI capabilities are greatly supplementing humans to do great work in less time in sectors like healthcare, banking and finance, transport, energy and robotics, etc. It will be interesting to see how this whole AI thing evolves with time," Bob Rogers, Data Scientist, AI and Analytics, Data Center Group at Intel, told IANS here.
In a blog post today, Intel (NASDAQ:INTC) CEO Brian Krzanich announced the Nervana Neural Network Processor (NNP). The Intel Nervana NNP promises to revolutionize AI computing across myriad industries. Using Intel Nervana technology, companies will be able to develop entirely new classes of AI applications that maximize the amount of data processed and enable customers to find greater insights – transforming their businesses... We have multiple generations of Intel Nervana NNP products in the pipeline that will deliver higher performance and enable new levels of scalability for AI models. This puts us on track to exceed the goal we set last year of achieving 100 times greater AI performance by 2020.
Imagine a future where complex decisions can be made faster and adapted over time. Where societal and industrial problems can be autonomously solved using learned experiences. It's a future where first responders using image-recognition applications can analyze streetlight camera images and quickly resolve missing- or abducted-person reports. It's a future where stoplights automatically adjust their timing to sync with the flow of traffic, reducing gridlock and optimizing starts and stops. It's a future where robots are more autonomous and performance efficiency is dramatically increased.
Many tech companies, including Apple, Google, Microsoft, NVIDIA and Intel itself, have created chips for image recognition and other deep-learning chores. However, Intel is also taking a different tack with an experimental chip called "Loihi." Rather than relying on raw computing horsepower, it uses an as-yet-unproven type of "neuromorphic" technology modeled after the human brain. Intel has been exploring neuromorphic tech for a while, and even designed a chip in 2012. Instead of logic gates, it uses "spiking neurons" as its fundamental computing unit.
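To give a sense of what "spiking neurons as a computing unit" means, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the textbook model behind most neuromorphic hardware. The parameters (threshold, leak factor) are illustrative assumptions, not Loihi's actual neuron model:

```python
# Minimal leaky integrate-and-fire (LIF) spiking neuron sketch.
# Threshold and leak values are illustrative, not taken from Loihi.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return a spike train (0/1 per time step) for a stream of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = leak * potential + current  # integrate input, with leakage
        if potential >= threshold:              # fire once threshold is crossed
            spikes.append(1)
            potential = 0.0                     # reset membrane potential
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# → [0, 0, 0, 1, 0, 0, 1]
```

The key contrast with conventional logic: information is carried by the *timing* of discrete spikes rather than by continuously clocked arithmetic, which is why such chips can be event-driven and power-efficient.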
I was wrong to say that Intel (INTC) doesn't need GPUs to compete with Nvidia (NVDA) on artificial intelligence/deep learning computing. Further research told me that along with the FPGA (Field-Programmable Gate Array), there's an embedded Intel Processor Graphics for deep learning inference. It's a new concept that Intel discussed only last May. Nvidia's GPU can be the training engine for deep learning computers. Intel's FPGAs and embedded Processor Graphics could be the go-to hardware accelerators for inference computing.
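The training/inference split described above comes down to precision: training needs high-precision arithmetic, while a frozen model can often run inference in low precision on a cheaper accelerator such as an FPGA. A hypothetical NumPy sketch of that split, using a toy linear model (all names and numbers here are illustrative):

```python
import numpy as np

# "Training" phase: full-precision gradient descent, the workload
# the article assigns to GPUs.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4)).astype(np.float32)
true_w = np.array([1.5, -2.0, 0.5, 3.0], dtype=np.float32)
y = X @ true_w

w = np.zeros(4, dtype=np.float32)
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(X)  # mean-squared-error gradient
    w -= 0.1 * grad

# "Inference deployment" phase: quantize the frozen weights to int8,
# roughly as an FPGA or embedded-graphics accelerator might.
scale = np.abs(w).max() / 127
w_int8 = np.round(w / scale).astype(np.int8)

x_new = np.ones(4, dtype=np.float32)
pred = (x_new @ w_int8.astype(np.float32)) * scale
print(round(float(pred), 2))  # close to 1.5 - 2.0 + 0.5 + 3.0 = 3.0
```

The point is that the deployed model does no gradient computation at all, so inference hardware can trade arithmetic precision and flexibility for throughput and power efficiency.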
Artificial Intelligence (AI) is the cutting edge in technology. It is fast going mainstream, moving out of the confines of science fiction. The onset of AI-based technology in India is evident in the sectors of e-commerce and research, where entities already using data analytics are now looking to explore AI. I got great perspectives on the potential of AI at Intel's AI Day in Bangalore recently. I also got to know the various parts that make up AI and why it is so complex.
Global chip maker Intel on Tuesday announced a string of initiatives to boost the usage of Artificial Intelligence (AI) in diverse sectors by collaborating with partners and customers across the country. "Our developer education programme will educate 15,000 scientists, developers, analysts and engineers on AI technologies, including Deep Learning and Machine Learning in India," said Intel South Asia Managing Director Prakash Mallya here. AI refers to software that enables computers and machines to reason and act intelligently, often faster and more predictably than a human mind. AI is also the main workload in data centres, which operate in line with Moore's Law of computing power doubling roughly every two years. By 2020, the industry expects more servers to process data analytics than other workloads, and analytics predictors will be built into every application.
Intel announced late last week that it has formed a new AI group to consolidate a number of its programs and acquisitions. It's headed by Naveen Rao, the former head of Intel acquisition Nervana. This means Intel is making sure it has a major seat at the table as artificial intelligence and machine learning branch out to touch virtually everything -- from autonomous driving to IoT to enhancing corporate systems -- over the next 5-7 years. In the short term, the group will focus on research related to its software and hardware (Nervana, Xeon/Lakecrest chips and subsequent families) to deliver AI for drones and autonomous vehicles, smart cities, health care, personal appliances, etc. But I expect a longer-term play.
Like all hardware device makers eager to meet the newest market opportunity, Intel is placing multiple bets on the future of machine learning hardware. The chipmaker has already cast its Xeon Phi and future integrated Nervana Systems chips into the deep learning pool while touting regular Xeons to do the heavy lifting on the inference side. However, a recent conversation we had with Intel turned up a surprising new addition to the machine learning conversation: an emphasis on neuromorphic devices and what Intel is openly calling "cognitive computing" (a term used primarily, and heavily, for IBM's Watson-driven AI technologies). This is the first time we've heard the company make any definitive claims about where neuromorphic chips might fit into a strategy to capture machine learning, and it marks a bold grab for the term "cognitive computing," which has been an umbrella term for Big Blue's AI business. Intel has been developing neuromorphic devices for some time; one of its first well-known prototypes appeared in 2012.