Zero Footprint AI
James Hodson, CEO, AI for Good Foundation


In the past decade, the Machine Learning community has achieved breakthrough improvements on a variety of inference, prediction, and control tasks. These improvements have been driven primarily by an explosion in computational power and by large clusters of machines working efficiently together. A 10% reduction in model error rate can often cost a 1,000-fold increase in model size, and several orders of magnitude more energy expended in training and running the resulting models. As data become ever more plentiful, and data scientists rely more and more on large state-of-the-art modelling algorithms, the question of the efficiency of learning per Watt of expended energy, and of how we weigh the ultimate utility of relative improvements in model accuracy, becomes ever more salient. The sustainability of running large-scale computations and ML applications has also gained attention: conservative estimates place the power consumption of the AlphaGo model that defeated Lee Sedol at around 1 MW during the match alone.
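To make the accuracy-versus-energy trade-off concrete, the comparison above can be sketched as a simple efficiency metric. The function names, the metric itself, and all numbers below are illustrative assumptions, not figures or code from the article; they merely mirror its example of a 10% relative error reduction costing three orders of magnitude more energy.

```python
# Hypothetical sketch of a "learning per Watt" comparison.
# All names and numbers are illustrative assumptions.

def accuracy_per_kwh(error_rate: float, energy_kwh: float) -> float:
    """Accuracy (1 - error) achieved per kilowatt-hour of training energy."""
    return (1.0 - error_rate) / energy_kwh

def marginal_gain_per_kwh(base_error: float, base_kwh: float,
                          big_error: float, big_kwh: float) -> float:
    """Extra accuracy bought per additional kWh when scaling a model up."""
    return (base_error - big_error) / (big_kwh - base_kwh)

# A small model, and a larger one with 10% lower relative error
# but ~1,000x the training energy (illustrative values).
small = {"error": 0.10, "kwh": 1.0}
large = {"error": 0.09, "kwh": 1000.0}

print(accuracy_per_kwh(small["error"], small["kwh"]))   # 0.9 accuracy/kWh
print(accuracy_per_kwh(large["error"], large["kwh"]))   # ~0.00091 accuracy/kWh
print(marginal_gain_per_kwh(small["error"], small["kwh"],
                            large["error"], large["kwh"]))  # ~1e-05
```

Under these assumed numbers, the marginal accuracy gain per kWh of the scaled-up model is roughly five orders of magnitude worse than the small model's baseline efficiency, which is the kind of comparison an energy-aware evaluation would surface.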
