AMD chases the AI trend with its Radeon Instinct GPUs for machine learning

PCWorld

With the Radeon Instinct line, AMD joins Nvidia and Intel in the race to put its chips into AI applications--specifically, machine learning for everything from self-driving cars to art. The company plans to launch three products under the new brand in 2017, which include chips from all three of its GPU families. The passively cooled Radeon Instinct MI6 will be based on the company's Polaris architecture. It will offer 5.7 teraflops of performance and 224GBps of memory bandwidth, and will consume up to 150 watts of power. The small-form-factor, Fiji-based Radeon Instinct MI8 will provide 8.2 teraflops of performance and 512GBps of memory bandwidth, and will consume up to 175 watts of power.
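Purely as a back-of-the-envelope illustration (not a calculation from the article), the quoted figures imply roughly the following performance-per-watt numbers; the sketch below assumes the stated peak teraflops and board power figures are directly comparable, which real workloads rarely are.

```python
# Back-of-the-envelope perf-per-watt from the figures quoted above.
# Assumes the quoted peak teraflops and board power are directly comparable,
# which is rarely true for real workloads.

cards = {
    "Radeon Instinct MI6": {"tflops": 5.7, "watts": 150},
    "Radeon Instinct MI8": {"tflops": 8.2, "watts": 175},
}

for name, spec in cards.items():
    gflops_per_watt = spec["tflops"] * 1000 / spec["watts"]
    print(f"{name}: {gflops_per_watt:.0f} GFLOPS per watt")

# Radeon Instinct MI6: 38 GFLOPS per watt
# Radeon Instinct MI8: 47 GFLOPS per watt
```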


Nervana Enhances Intel Machine Learning & Artificial Intelligence Portfolio

#artificialintelligence

With Intel's acquisition of artificial intelligence startup Nervana, machine learning moves into mainstream focus for pushing the boundaries of technology. No doubt the deal was timed to capture some of the luster from NVIDIA and its stellar earnings. Nervana follows last year's purchase of Altera as Intel actively expands beyond reliance on its core CPU base. Intel says it will continue to invest in leading-edge technologies that complement and enhance its AI portfolio. This fits well with Altera's field-programmable gate arrays (FPGAs) and programmable logic devices (PLDs).


Despite the hype, nobody is beating Nvidia in AI

#artificialintelligence

You have to wonder whether Nvidia is going to get sick of winning all the time. The company's stock price is up to $178, 69% higher than at this time last year. Nvidia is riding high on its core technology, the graphics processing unit used in the machine learning that powers the algorithms of Facebook and Google; on partnerships with nearly every company keen on building self-driving cars; and on freshly announced hardware deals with three of China's biggest internet companies. Investors say this isn't even the top for Nvidia: William Stein at SunTrust Robinson Humphrey predicts that Nvidia's revenue from selling server-grade GPUs to internet companies, which doubled last year, will continue to increase 61% annually until 2020. Nvidia will likely see competition in the near future.
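As a rough illustration of what that projection implies (my arithmetic, not the analyst's), 61% annual growth compounds quickly. The sketch below simply applies the stated rate over an assumed 2017 to 2020 window to a starting revenue normalized to 1.0, since the article gives no base figure.

```python
# Rough compounding of the 61% annual growth rate cited above.
# The 2017-2020 window is an assumption; starting revenue is normalized to 1.0
# because the article does not give a base figure.

revenue = 1.0
for year in range(2017, 2021):
    revenue *= 1.61
    print(f"{year}: {revenue:.2f}x the starting revenue")

# 2017: 1.61x ... 2020: 6.72x the starting revenue
```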


Intel, Nervana Shed Light on Deep Learning Chip Architecture

#artificialintelligence

Almost two years after its acquisition by Intel, the deep learning chip architecture from startup Nervana Systems will finally move from its codenamed "Lake Crest" status to an actual product. In that time Nvidia, which owns the deep learning training market by a long shot, has had time to firm up its commitment to this expanding (if not overhyped in terms of overall industry dollar figures) market, with new deep learning-tuned GPUs and appliances on the horizon as well as software tweaks to make training at scale more robust. In other words, even with solid technology at a reasonable price point, it will take a herculean effort for Intel to bring Nervana to the fore of the training market and to push its other products for inference at scale along with that current, an effort Intel seems willing to make given its aggressive roadmap for the Nervana-based lineup. The difference now is that we at least have some insight into how (and by how much) this architecture differs from GPUs, where it might carve out a performance advantage, and, more certainly, where it gains a power-efficiency one. According to Carey Kloss, who joined Nervana four years ago as its first non-founder employee and now heads AI hardware within Intel, the chip will be very similar to the first generation Nervana was set to bring to market before the acquisition, with the added benefit of Intel expertise and technology feeding developments that put the deep learning chip on a yearly release cadence.


The Next Wave of Deep Learning Architectures

#artificialintelligence

Intel has planted some solid stakes in the ground for the future of deep learning over the last month with its acquisitions of deep learning chip startup Nervana Systems and, most recently, mobile and embedded machine learning company Movidius. These new pieces will snap into Intel's still-forming puzzle for capturing the supposed billion-plus dollar deep learning market ahead, complementing its own Knights Mill effort and its software optimization work on machine learning codes and tooling. At the same time, just down the coast, Nvidia is firming up the market for its own GPU training and inference chips as well as its own hardware outfitted with the latest Pascal GPUs and the requisite deep learning libraries. While Intel's efforts have garnered significant headlines recently with that surprising pair of acquisitions, moves that are pushing Nvidia harder to demonstrate GPU acceleration (thus far the dominant compute engine for model training) for deep learning, the company still has work to do to capture mindshare in this emerging market. Further complicating matters, the last two years have brought a number of newcomers to the field: deep learning chip upstarts touting the idea that general-purpose architectures (including GPUs) cannot compare to a low-precision, fixed-point, specialized approach.
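To make that closing claim concrete: "low precision, fixed point" refers to representing weights and activations as small integers with a shared scale rather than as 32-bit floats. The sketch below is a generic, textbook illustration of 8-bit quantization with a single per-tensor scale, not any particular vendor's format.

```python
import numpy as np

# Generic illustration of low-precision, fixed-point quantization:
# map float32 values onto signed 8-bit integers with one shared scale.
# This is a textbook sketch, not any specific accelerator's scheme.

def quantize_int8(x: np.ndarray):
    scale = np.max(np.abs(x)) / 127.0          # one scale for the whole tensor
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
error = np.abs(weights - dequantize(q, scale)).max()
print(f"max abs quantization error: {error:.4f}")
```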