Why 2017 is setting up to be the year of GPU chips in deep learning


GPU technology has been around for decades, but only recently has it gained traction among enterprises. As the name suggests, GPUs were traditionally used to render computer graphics. But as deep learning and artificial intelligence have grown in prominence, so has the need for fast, parallel computation to train models. "A couple years ago, we wouldn't be looking at special hardware for this," said Adrian Bowles, founder of analyst firm STORM Insights Inc. in Boston. "But with [deep learning], you have a lot of parallel activities going on, and GPU-based tools are going to give you more cores."
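To make the point concrete, here is a minimal illustrative sketch (not from the article) of how a deep learning framework such as PyTorch offloads a training step to a GPU when one is available. The model, data, and hyperparameters are all hypothetical; the point is that the matrix math in the forward and backward passes runs in parallel across the GPU's many cores.

    import torch
    import torch.nn as nn

    # Use the GPU if one is present; otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A tiny fully connected network; real models are far larger,
    # which is where GPU parallelism pays off.
    model = nn.Sequential(
        nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10)
    ).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # A synthetic batch, placed on the same device as the model.
    inputs = torch.randn(128, 64, device=device)
    targets = torch.randint(0, 10, (128,), device=device)

    # One training step: the matrix multiplies here execute across
    # thousands of GPU cores at once instead of a handful of CPU cores.
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"device={device}, loss={loss.item():.4f}")

The same script runs unchanged on either device; only the `device` setting moves the work onto the GPU, which is why training workloads migrated there so quickly.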
