From smartphone assistants to image recognition and translation, machine learning already helps us in our everyday lives. But it can also help us to tackle some of the world's most challenging physical problems -- such as energy consumption. Large-scale commercial and industrial systems like data centres consume a lot of energy, and while much has been done to stem the growth of energy use, there remains a lot more to do given the world's increasing need for computing power. Reducing energy usage has been a major focus for us over the past 10 years: we have built our own super-efficient servers at Google, invented more efficient ways to cool our data centres and invested heavily in green energy sources, with the goal of being powered 100 percent by renewable energy. Compared to five years ago, we now get around 3.5 times the computing power out of the same amount of energy, and we continue to make many improvements each year.
Wind power has become increasingly popular, but its success is limited by the fact that wind comes and goes as it pleases, making it hard for power grids to count on renewable energy and making them less likely to fully embrace it. While we can't control the wind, Google has an idea for the next best thing: using machine learning to predict it. Google and DeepMind have started testing machine learning on Google's own wind turbines, which are part of the company's renewable energy projects. Beginning last year, they fed weather forecasts and existing turbine data into DeepMind's machine learning platform, which produced wind power predictions 36 hours ahead of actual power generation.
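The article doesn't describe DeepMind's model, but the core mapping it must learn — from a forecast wind speed to expected turbine output — can be sketched with a standard idealised power curve. The sketch below is a hypothetical illustration: the cut-in, rated, and cut-out speeds and the rated power are made-up values, not anything from Google's system.

```python
# Idealised wind-turbine power curve: given a forecast wind speed,
# estimate the power a turbine would produce.
# All constants are hypothetical illustration values.

CUT_IN = 3.0          # m/s: below this, the turbine produces nothing
RATED_SPEED = 12.0    # m/s: at and above this, output is capped at rated power
CUT_OUT = 25.0        # m/s: above this, the turbine shuts down for safety
RATED_POWER = 2500.0  # kW: nameplate capacity

def predicted_power_kw(wind_speed: float) -> float:
    """Estimate power output (kW) for a forecast wind speed (m/s)."""
    if wind_speed < CUT_IN or wind_speed >= CUT_OUT:
        return 0.0
    if wind_speed >= RATED_SPEED:
        return RATED_POWER
    # Between cut-in and rated speed, available power grows roughly
    # with the cube of the wind speed.
    return RATED_POWER * (wind_speed / RATED_SPEED) ** 3

# A 36-hour-ahead forecast of hourly wind speeds maps to a power schedule:
forecast = [2.0, 8.0, 13.0, 26.0]            # m/s, hypothetical values
schedule = [predicted_power_kw(v) for v in forecast]
```

In practice a learned model would correct this physical baseline using weather-forecast features and historical turbine data; the point here is only the shape of the speed-to-power relationship that makes the 36-hour forecast valuable to a grid operator.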
Tech giant Google has taken a "phenomenal step forward" in its efforts to drive energy efficiency, after developing artificial intelligence (AI) that has reduced the energy used for cooling at its data centres by 40%. Google's data centres, although powered by renewables, still consume vast amounts of energy during cooling processes. Working with its DeepMind research company, Google has live-tested a system of neural networks - computer systems modelled on the human brain - that has led to a more efficient and adaptive framework for data centre management. DeepMind trained these neural networks to predict temperatures and pressures within the centres 60 minutes in advance, then used those predictions to set the appropriate cooling requirements and lower energy consumption. The system not only delivered the 40% cut in cooling energy, but also reduced Power Usage Effectiveness (PUE) – the ratio of total building energy use to IT energy use – by as much as 15%.
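Because PUE is a simple ratio, the relationship between the cooling cut and the PUE improvement is easy to make concrete. The figures below are hypothetical examples, not Google's actual numbers:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total building energy use / IT energy use.
    A perfect facility (no cooling or distribution overhead) scores 1.0."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,000 kWh of IT load plus 500 kWh of overhead
# (cooling, power distribution), for 1,500 kWh total.
before = pue(1500.0, 1000.0)   # 1.5

# Cutting the overhead (mostly cooling) by 40% leaves 300 kWh of overhead.
after = pue(1300.0, 1000.0)    # 1.3

improvement = (before - after) / before  # fractional drop in PUE
```

With these made-up numbers a 40% cooling cut yields roughly a 13% drop in PUE, which is in the same ballpark as the "as much as 15%" the article reports.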
For all of its power and promise, artificial intelligence has some big drawbacks -- its massive carbon footprint is one of them. Training a 'regular' AI using a single high-performance graphics card produces the same amount of carbon as a flight across the United States, according to MIT Technology Review. That's because AI requires so much data. All of it must be captured, stored, analyzed, and sent out, and this requires vast amounts of processing power -- which in turn means more servers, larger data center footprints, and more cooling.
To further enhance its research capabilities, Eco Marine Power announced today that it will begin using the Neural Network Console provided by Sony Network Communications Inc. as part of a strategy to incorporate Artificial Intelligence (AI) into various ongoing ship-related technology projects, including the further development of the patented Aquarius MRE (Marine Renewable Energy) and EnergySail.