MIT researchers warn that deep learning is approaching computational limits


That's according to researchers at the Massachusetts Institute of Technology, Underwood International College, and the University of Brasilia, who found in a recent study that progress in deep learning has been "strongly reliant" on increases in compute. They assert that continued progress will require "dramatically" more computationally efficient deep learning methods, either through changes to existing techniques or via new, as-yet-undiscovered methods.

"We show deep learning is not computationally expensive by accident, but by design. The same flexibility that makes it excellent at modeling diverse phenomena and outperforming expert models also makes it dramatically more computationally expensive," the coauthors wrote. "Despite this, we find that the actual computational burden of deep learning models is scaling more rapidly than (known) lower bounds from theory, suggesting that substantial improvements might be possible."
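The underlying argument is quantitative: benchmark performance improves only polynomially with compute, so each additional increment of accuracy demands a disproportionate increase in computation. As a rough illustration only, and not the paper's code or data, the sketch below fits a power law of error against training compute in log-log space and reports the implied compute multiplier needed to halve the error; the data points are hypothetical.

```python
# Illustrative sketch (hypothetical numbers): fit error ~ k * compute^b
# by linear regression in log-log space, then estimate how much extra
# compute a further halving of the error would require.
import numpy as np

# Hypothetical (training compute in FLOPs, benchmark error rate) pairs.
compute = np.array([1e15, 1e16, 1e17, 1e18, 1e19])
error = np.array([0.20, 0.15, 0.11, 0.085, 0.065])

# log10(error) = log10(k) + b * log10(compute); np.polyfit returns [slope, intercept].
b, log_k = np.polyfit(np.log10(compute), np.log10(error), deg=1)

print(f"scaling exponent b ~ {b:.3f}")  # negative slope: error falls slowly with compute
print(f"compute multiplier to halve error: x{10 ** (np.log10(0.5) / b):.0f}")
```

With a shallow exponent like the one recovered here, halving the error implies hundreds of times more compute, which is the kind of diminishing return the study points to.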
