New AI Chips Set to Reshape Data Centers - EE Times India

#artificialintelligence

AI chip startups are hot on the heels of GPU leader Nvidia. At the same time, there is also significant competition in data center inference... New computing models such as machine learning and quantum computing are becoming more important for delivering cloud services. The most immediate computing change has been the rapid adoption of ML/AI for consumer and business applications. This new model requires processing vast amounts of data to develop usable information and, eventually, to build knowledge models. These models are growing rapidly in complexity, doubling every 3.5 months.
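The "doubling every 3.5 months" figure compounds very quickly, which is what makes it significant for data center planning. A back-of-the-envelope calculation (a sketch added here for illustration, not taken from the article) shows roughly an order of magnitude of growth per year:

```python
# If complexity doubles every 3.5 months, total growth after m months
# is 2^(m / 3.5) -- exponential compounding, not linear growth.
DOUBLING_PERIOD_MONTHS = 3.5

def growth_factor(months):
    """Total growth after `months`, given one doubling per period."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

print(round(growth_factor(12), 1))  # about 10.8x in one year
print(round(growth_factor(24)))     # about 116x in two years
```

At that pace, a model's compute demand grows about tenfold per year, which helps explain why chip startups see an opening against incumbent hardware.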


CPUs vs GPUs: Which chips will give firms the AI edge?

#artificialintelligence

Mumbai: Early this month at Intel AI Devcon 2018 in Bengaluru, a holographic avatar called Ella listened intently to composer Kevin Doucette playing notes on his synthesizer. When he paused, she began composing her own notes, complementing his music in real time. Ella was learning about features such as tempo, scale and pitch from the music data being streamed in real time to an Intel Movidius Neural Compute Stick. To perform this artificial intelligence (AI) task, Intel used a class of artificial neural networks called the recurrent neural network (RNN), which depends on previous calculations to work on current ones. The Neural Compute Stick is simply a case in point that Intel, a company most people identify with the central processing units (CPUs) inside personal computers (PCs), mobiles and servers, is widening its portfolio to stay in an AI race whose strong contenders include Nvidia, Microsoft, Google, Facebook, IBM, Amazon, Apple, Alibaba and Baidu.
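The defining property of an RNN mentioned above, that each step's output depends on the previous step's state, can be sketched in a few lines. This is an illustrative toy with arbitrarily chosen scalar weights, not Intel's actual model:

```python
import math

def rnn_step(x_t, h_prev, w_x=0.5, w_h=0.8):
    """One recurrent step: the new hidden state mixes the current
    input x_t with the previous hidden state h_prev."""
    return math.tanh(w_x * x_t + w_h * h_prev)

def run_rnn(inputs):
    """Process a sequence; each step reuses the previous result,
    which is why RNNs suit streaming data such as live music notes."""
    h = 0.0
    states = []
    for x in inputs:
        h = rnn_step(x, h)
        states.append(h)
    return states

states = run_rnn([1.0, 0.5, -0.2])
```

Feeding the same input value at a different point in the sequence produces a different hidden state, because the carried-over state encodes what came before.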


How Nvidia is surfing the AI wave

#artificialintelligence

San Francisco: Jensen Huang, co-founder, president and chief executive officer of Santa Clara-based Nvidia Corp., says that the rapid adoption of artificial intelligence (AI) technologies such as machine learning, deep learning, natural language processing and computer vision augurs well for his company's growth prospects. His confidence stems from the fact that Nvidia designs chips that deliver the extra computing power clients need in an algorithm-driven world, one that increasingly uses these AI technologies to make business sense of the voluminous data users generate and thus gain a competitive edge. These chips, called graphics processing units (GPUs), helped Nvidia fuel the growth of the personal computer gaming market almost two decades ago. Huang hopes the increasing use of GPUs for AI will help his company repeat that success. He argues that adding more central processing unit (CPU) transistors to a computer yields only a small increase in application performance, whereas GPUs, which are designed to handle many tasks simultaneously, are far better suited to high-performance computing.
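Huang's CPU-versus-GPU argument rests on data parallelism: workloads where each output element depends only on inputs at the same position can be spread across thousands of GPU cores at once. A minimal sketch of such an operation, written in plain Python to illustrate the idea rather than any real GPU API:

```python
# SAXPY (y = a*x + y) is a classic data-parallel kernel: each output
# element is computed independently of the others, so on a GPU every
# iteration of this loop could run on its own hardware thread.
def saxpy(a, xs, ys):
    return [a * x + y for x, y in zip(xs, ys)]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
# each element computed independently: [12.0, 24.0, 36.0]
```

A CPU executes such a loop a few elements at a time; a GPU's advantage comes from launching all the independent element computations together, which is exactly the "multiple tasks simultaneously" design Huang describes.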