DeepMind AI Reduces Google Data Centre Cooling Bill by 40%

#artificialintelligence

From smartphone assistants to image recognition and translation, machine learning already helps us in our everyday lives. But it can also help us to tackle some of the world's most challenging physical problems -- such as energy consumption. Large-scale commercial and industrial systems like data centres consume a lot of energy, and while much has been done to stem the growth of energy use, there remains a lot more to do given the world's increasing need for computing power. Reducing energy usage has been a major focus for us over the past 10 years: we have built our own super-efficient servers at Google, invented more efficient ways to cool our data centres and invested heavily in green energy sources, with the goal of being powered 100 percent by renewable energy. Compared to five years ago, we now get around 3.5 times the computing power out of the same amount of energy, and we continue to make many improvements each year.


Predicting Energy Production

#artificialintelligence

For AI4IMPACT's Deep Learning Datathon 2020, TEAM DEFAULT created a neural-network-based deep learning model for predicting energy production demand in France. The model was built in Smojo on AI4IMPACT's cloud-based learning and model deployment system. It achieved a test loss of 0.131, beating the persistence loss of 0.485 by a fair margin. As energy markets become increasingly liberalized around the world, optimized forecasting of energy demand has grown in importance on the free and open market, and new and existing entrants turn to data-driven methods to forecast energy consumption in hopes of turning a profit.
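The snippet does not describe the model itself, so the sketch below only illustrates the metric it is judged against: a persistence baseline that predicts the next value as the most recent observed value, compared with a simple lag-based least-squares regressor standing in for the neural network. The synthetic series, the 24-step lag window, and the regressor are assumptions for illustration, not the team's Smojo model.

```python
import numpy as np

# Hypothetical hourly energy series; the datathon data is not included in the snippet.
rng = np.random.default_rng(0)
t = np.arange(24 * 60)
series = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

def mse(pred, true):
    return float(np.mean((pred - true) ** 2))

# Build lagged features: predict the next value from the previous 24 values.
lags = 24
X = np.stack([series[i:i + lags] for i in range(series.size - lags)])
y = series[lags:]
split = int(0.8 * len(y))

# Persistence baseline: forecast the next value as the most recent observed value.
persistence_loss = mse(X[split:, -1], y[split:])

# Stand-in learned model: ordinary least squares on the lag features
# (a simple proxy for the neural network, which the article does not detail).
w, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
model_loss = mse(X[split:] @ w, y[split:])

print(f"persistence loss: {persistence_loss:.3f}, model loss: {model_loss:.3f}")
```

The point of the comparison is the same as in the article: any useful learned forecaster should beat persistence, as the team's 0.131 versus 0.485 result does.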


Scientists develop energy-efficient AI processor - ET Telecom

#artificialintelligence

SEOUL: A group of South Korean scientists has developed an Artificial Intelligence (AI)-based processor that can be used for various deep learning technologies. According to the Ministry of Science and ICT, the team led by Yoo Hoi-jun of the Korea Advanced Institute of Science and Technology (KAIST) developed a neural network recognition processor that is more energy efficient than those in use today, Yonhap News Agency reported on Monday. Notably, the team's processor delivered fourfold higher efficiency than Google Inc.'s Tensor Processing Unit (TPU), considered a front runner in neural network computation. The new processor can also run both convolutional and recurrent neural networks simultaneously. "It is meaningful that the research team developed the processor that can be activated at low power to realise the AI technology," said Yoo, who is a professor of electrical engineering.


E2-Train: Energy-Efficient Deep Network Training with Data-, Model-, and Algorithm-Level Saving

arXiv.org Machine Learning

Convolutional neural networks (CNNs) have been increasingly deployed to edge devices, and many efforts have been made towards efficient CNN inference on resource-constrained platforms. This paper explores an orthogonal direction: how to conduct more energy-efficient training of CNNs, so as to enable on-device training. We strive to reduce the energy cost during training by dropping unnecessary computations at three complementary levels: stochastic mini-batch dropping at the data level; selective layer update at the model level; and sign prediction for low-cost, low-precision back-propagation at the algorithm level. Extensive simulations and ablation studies, with real energy measurements from an FPGA board, confirm the superiority of our proposed strategies and demonstrate remarkable energy savings for training. For example, when training ResNet-74 on CIFAR-10, we achieve aggressive energy savings of >90% and >60%, while incurring a top-1 accuracy loss of only about 2% and 1.2%, respectively. When training ResNet-110 on CIFAR-100, an over 84% training energy saving is achieved without degrading inference accuracy.
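The data- and model-level ideas are straightforward to sketch. Below is a minimal, hypothetical PyTorch-style illustration of stochastic mini-batch dropping (skip a batch with some probability) and selective layer update (freeze a random subset of layers each step). The toy model, the drop probabilities, and the random skip criteria are assumptions for illustration; the paper's actual schedules and its sign-prediction low-precision back-propagation are not reproduced here.

```python
import random

import torch
import torch.nn as nn

# Toy fully connected model standing in for ResNet-74; only the layer-wise
# structure matters for this illustration.
model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

P_DROP_BATCH = 0.5    # assumed probability of skipping a mini-batch (data level)
P_FREEZE_LAYER = 0.5  # assumed probability of freezing a layer this step (model level)

def train_step(x, y):
    # Data level: stochastic mini-batch dropping; occasionally skip the batch entirely.
    if random.random() < P_DROP_BATCH:
        return None

    # Model level: selective layer update; freeze a random subset of layers so
    # no gradients or weight updates are computed for them this step.
    linear_layers = [m for m in model if isinstance(m, nn.Linear)]
    frozen = [random.random() < P_FREEZE_LAYER for _ in linear_layers]
    if all(frozen):
        frozen[-1] = False  # keep at least one layer trainable so backward() has work to do
    for layer, freeze in zip(linear_layers, frozen):
        for p in layer.parameters():
            p.requires_grad_(not freeze)

    optimizer.zero_grad(set_to_none=True)
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random tensors standing in for CIFAR-10 batches;
# a dropped batch returns None and costs no backward pass.
x = torch.randn(16, 32)
y = torch.randint(0, 10, (16,))
print(train_step(x, y))
```

The energy argument is that both mechanisms remove entire forward/backward computations rather than merely shrinking them, which is what yields the large savings reported in the abstract.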


Machine learning can boost the value of wind energy

#artificialintelligence

Carbon-free technologies like renewable energy help combat climate change, but many of them have not reached their full potential. Consider wind power: over the past decade, wind farms have become an important source of carbon-free electricity as the cost of turbines has plummeted and adoption has surged. However, the variable nature of wind itself makes it an unpredictable energy source--less useful than one that can reliably deliver power at a set time. In search of a solution to this problem, last year, DeepMind and Google started applying machine learning algorithms to 700 megawatts of wind power capacity in the central United States. These wind farms--part of Google's global fleet of renewable energy projects--collectively generate as much electricity as is needed by a medium-sized city.