Results


Intel Proposes Its Embedded Processor Graphics For Real-Time Artificial Intelligence

#artificialintelligence

Further research told me that along with FPGAs (field-programmable gate arrays), Intel offers embedded Processor Graphics for deep learning inference. Unlike Microsoft's Project Brainwave (which relies solely on Altera's Stratix 10 FPGA to accelerate deep learning inference), Intel's Inference Engine design uses integrated GPUs alongside FPGAs. Together, Intel's embedded Processor Graphics and Altera's Stratix 10 FPGA could be the top hardware products for accelerating deep learning inference. Marketing its embedded graphics processors to accelerate deep learning/artificial intelligence computing is one more reason for us to stay long INTC.


Intel unveils AI-focused Movidius VPU chip

ZDNet

Intel on Monday announced its next-generation Movidius vision processing unit with improved processing capabilities for edge devices such as drones, VR headsets, smart cameras, wearables, and robots. The latest VPU is the Myriad X system-on-chip, equipped with a dedicated Neural Compute Engine to support deep learning inference at the edge. The Movidius Myriad 2 Vision Processing Unit will be used to run deep neural networks for higher-accuracy, local video analytics. Intel's Movidius has also launched an AI accelerator on a $79 USB stick: the Movidius Neural Compute Stick compiles, tunes, and accelerates neural networks at the edge.


Nvidia's Next Big Thing: The HGX-1 AI Platform

#artificialintelligence

In an article published last December, I said Nvidia's stock could scale new highs if the company's revenue continues to grow at a CAGR of 20%-plus for the foreseeable future. Investors need to understand how the advantages of TensorFlow, coupled with Nvidia's HGX-1 reference architecture, will boost its revenue going forward, which Nvidia's stock at today's price of around $150 hasn't factored in. While Intel (NASDAQ:INTC) was busy making its MKL (Math Kernel Library) more versatile by incorporating the Neon deep learning framework from Nervana, the startup Intel acquired almost a year ago, Nvidia quietly made its CUDA parallel computing platform compatible with TensorFlow. AMD, with its upcoming Radeon Instinct line of GPUs and its upcoming Zen-based Naples CPUs (bundled with SenseMI technology), could be Nvidia's real competitor, but for that to happen the OpenCL (Open Computing Language) software library (the CUDA equivalent for AMD and, to some extent, Intel) needs to support TensorFlow.


The Race Is On For The AI Chip

#artificialintelligence

Artificial Intelligence ('AI') is finally taking off, helped by big data, cloud computing, and breakthroughs in neural networks (computer code that emulates large networks of very simple interconnected units, a bit like neurons in the brain) and deep learning (how we sharpen AI by structuring neural networks in multiple processing layers). Of course, Nvidia itself isn't sitting still: it has the Xavier system-on-chip, integrating a CPU, a CUDA GPU, and deep learning accelerators for the forthcoming Drive PX3 (autonomous driving). Xavier upgrades the GPU core from Pascal to Volta, significantly reducing energy costs (just 20W). Although Pascal has performed well in deep learning, Volta is far superior because it unifies CUDA Cores and Tensor Cores.
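The "multiple processing layers" idea behind deep learning can be made concrete with a minimal NumPy sketch. This is an illustrative toy (the function names, layer sizes, and random weights are our own, not from any article or vendor): each layer is a matrix multiply plus bias, with a nonlinearity between layers.

```python
import numpy as np

def relu(x):
    # Simple nonlinearity applied between layers
    return np.maximum(0.0, x)

def forward(x, layers):
    """Pass input x through a stack of (weight, bias) layers."""
    for w, b in layers[:-1]:
        x = relu(x @ w + b)
    w, b = layers[-1]
    return x @ w + b  # final layer left linear

rng = np.random.default_rng(0)
# Two hidden "processing layers" followed by a scalar output
layers = [
    (rng.normal(size=(4, 8)), np.zeros(8)),
    (rng.normal(size=(8, 8)), np.zeros(8)),
    (rng.normal(size=(8, 1)), np.zeros(1)),
]
y = forward(rng.normal(size=(2, 4)), layers)
print(y.shape)  # (2, 1): one output per input row
```

Stacking more (weight, bias) pairs deepens the network; in practice the matrix multiplies are what GPU hardware such as Volta's Tensor Cores accelerates.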


Intel India Eyes AI Opportunities, Plans to Develop Ecosystem

#artificialintelligence

Intel's collaboration with companies such as Google, and its acquisitions, including Saffron, Movidius, Nervana Systems, and Mobileye, further Intel's AI capabilities, giving the company an edge, especially at a time when embedded computer vision is becoming increasingly important the world over. AI at work: AI is the combination of various fields such as machine learning, natural language processing, computer vision, reasoning systems, neural networks, deep learning, depth sensing, programmable systems, parallel computing, and more. Among Intel's AI ecosystem initiatives: to engage students, researchers, and developers, Intel India announced a comprehensive developer community initiative, the AI Developer Education Program, targeted at educating 15,000 scientists, developers, analysts, and engineers on key AI technologies, including deep learning and machine learning. IIT Patna has been doing cutting-edge research and development in artificial intelligence, distributed computing, network security, social networks, and beyond, using data-driven machine learning as well as knowledge- and deep-learning-based methods.


Is machine learning the next commodity?

#artificialintelligence

Driving this surge of machine-learning development is a wave of data generated by mobile phones, sensors, and video cameras. As a result, we expect machine learning will become the next great commodity. Consider Linux: released as a free, open-source operating system in 1991, it now powers nearly all the world's supercomputers, most of the servers behind the Internet, and the majority of financial trades worldwide, not to mention tens of millions of Android mobile phones and consumer devices. A director at Intel Capital, Sanjit Dang drives investments in user computing across the consumer and enterprise sectors.


Nvidia Up Another 5%: Bulls Delighted Pondering 'Epic Disruption'

#artificialintelligence

B. Riley's Craig Ellis reiterates a Buy rating and a $135 price target, writing that Nvidia "seems well positioned to stay at A.I. ..." Vijay Rakesh of Mizuho reiterates his Buy rating and raises his price target to $145 from $130, writing that the new market size the company cited for data center chips is $30 billion by 2020, two-and-a-half times the current size. Another tidbit we thought intriguing is that Nvidia intends to open-source Xavier (give away the source code freely in order to accelerate adoption of deep learning). Adds Wong, "Nvidia believes that its datacenter GPU TAM might rise to about $30 billion (HPC, deep learning training, and inference)."


The rise of AI marks an end to CPU dominated computing

#artificialintelligence

Digitization means lots of data, and making sense of lots of data increasingly looks like either an AI problem or an HPC problem (which are really the same in a lot of ways). A fast-growing, lucrative market means competition, of course, and a raft of new AI chips is around the corner. Other AI chip efforts include Mobileye (the company is being bought by Intel), Graphcore, BrainChip, TeraDeep, KnuEdge, Wave Computing, and Horizon Robotics. From automobiles to laptops to desktops to gaming to HPC to AI, the company manages to increase its Total Available Market (TAM) with minimal duplication of effort.


To democratise artificial intelligence, Intel launches educational programme for developers

#artificialintelligence

Reiterating its commitment to boosting adoption of artificial intelligence (AI), Intel India today announced a developer community initiative, the AI Developer Education Programme, aimed at educating 15,000 scientists, developers, analysts, and engineers. The educational programme also covers deep learning and machine learning, the tech major said in a statement. The programme was announced at the first AI Day, held in Bengaluru, where thought leaders from government, industry, and academia congregated to discuss the potential of accelerating the AI revolution in the country. Under the programme, Intel will run 60 programmes across the year, ranging from workshops and roadshows to user groups and senior technology leader roundtables. Announcing the programme, Intel South Asia managing director Prakash Mallya said data centres, and the intelligence behind the data they collect, can enable government and industry to make effective decisions based on algorithms.