Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models

arXiv.org Machine Learning

Deep learning (DL) can achieve impressive results across a wide variety of tasks, but this often comes at the cost of training models for extensive periods on specialized hardware accelerators. This energy-intensive workload has seen immense growth in recent years. Machine learning (ML) may become a significant contributor to climate change if this exponential trend continues. If practitioners are aware of their energy and carbon footprint, then they may actively take steps to reduce it whenever possible. In this work, we present Carbontracker, a tool for tracking and predicting the energy and carbon footprint of training DL models. We propose that the energy and carbon footprint of model development and training be reported alongside performance metrics using tools like Carbontracker. We hope this will promote responsible computing in ML and encourage research into energy-efficient deep neural networks.
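The workflow Carbontracker instruments is the ordinary epoch loop. Below is a self-contained sketch of that shape, using a no-op stand-in class rather than the real `carbontracker.tracker.CarbonTracker` so it runs anywhere; the constructor argument and method names mirror the package's documented interface, but treat the exact signatures as an assumption here:

```python
class CarbonTrackerStub:
    """No-op stand-in mirroring Carbontracker's interface:
    CarbonTracker(epochs=...), epoch_start(), epoch_end(), stop().
    The real tracker measures energy per epoch and predicts the
    footprint of the full run after the first epoch."""

    def __init__(self, epochs):
        self.epochs = epochs
        self.completed = 0

    def epoch_start(self):
        pass  # real tracker: begin power/energy measurement

    def epoch_end(self):
        self.completed += 1  # real tracker: log epoch energy, update prediction

    def stop(self):
        pass  # real tracker: flush logs, report total energy and CO2


max_epochs = 3
tracker = CarbonTrackerStub(epochs=max_epochs)
for epoch in range(max_epochs):
    tracker.epoch_start()
    # ... forward/backward passes for one epoch would go here ...
    tracker.epoch_end()
tracker.stop()
```

The point of the design is that tracking is additive: two calls per epoch and one at the end, with no change to the training code itself.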


The Imperative for Sustainable AI Systems

#artificialintelligence

This piece was the winner of the inaugural Gradient Prize. AI systems are compute-intensive: the AI lifecycle often requires long-running training jobs, hyperparameter searches, inference jobs, and other costly computations. They also require massive amounts of data that might be moved over the wire, and they require specialized hardware to operate effectively, especially at large scale. All of these activities require electricity, which has a carbon cost. There are also carbon emissions from ancillary needs such as hardware manufacturing and datacenter cooling [1]. Thus, AI systems have a massive carbon footprint [2]. This carbon footprint also has consequences for social justice, as the article goes on to explore.


Energy Usage Reports: Environmental awareness as part of algorithmic accountability

arXiv.org Machine Learning

The carbon footprint of algorithms must be measured and transparently reported so computer scientists can take an honest and active role in environmental sustainability. In this paper, we take analyses usually applied at the industrial level and make them accessible for individual computer science researchers with an easy-to-use Python package. Localizing to the energy mixture of the electrical power grid, we make the conversion from energy usage to CO2 emissions, in addition to contextualizing these results with more human-understandable benchmarks such as automobile miles driven. We also include comparisons with energy mixtures employed in electrical grids around the world. We propose including these automatically-generated Energy Usage Reports as part of standard algorithmic accountability practices, and demonstrate the use of these reports as part of model-choice in a machine learning context.
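The core conversion such a report performs, from measured energy through a grid's carbon intensity to emissions and a human-understandable car-miles benchmark, can be sketched in a few lines. Everything below is illustrative: the grid names, the intensity values (kg CO2 per kWh), and the car-miles factor are round placeholder numbers, not the package's actual data:

```python
# Illustrative grid carbon intensities in kg CO2 per kWh (placeholder values;
# a real report localizes to the actual energy mixture of the user's grid).
GRID_KG_CO2_PER_KWH = {
    "coal-heavy": 0.9,
    "us-average": 0.4,
    "hydro-heavy": 0.02,
}

# Rough placeholder for an average passenger car's emissions per mile.
KG_CO2_PER_MILE_DRIVEN = 0.4


def co2_report(energy_kwh, grid="us-average"):
    """Convert measured energy to CO2 emissions for a given grid mixture,
    and contextualize the result as equivalent automobile miles driven."""
    kg_co2 = energy_kwh * GRID_KG_CO2_PER_KWH[grid]
    return {
        "kg_co2": kg_co2,
        "car_miles_equivalent": kg_co2 / KG_CO2_PER_MILE_DRIVEN,
    }


# Example: a hypothetical 120 kWh training run on the US-average mixture.
report = co2_report(120.0)
```

The same energy figure yields very different emissions on different grids, which is why the paper emphasizes localizing to the grid's energy mixture rather than using a single global factor.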


How Having Bigger AI Models Can Have A Detrimental Impact On Environment

#artificialintelligence

The COVID crisis has accelerated the adoption of artificial intelligence -- from tackling the global pandemic itself to serving as a vital tool in managing various business processes. Despite its benefits, AI has long been scrutinised over ethical concerns such as bias and privacy. However, the technology also has significant sustainability issues: it is known to consume a massive amount of energy, creating a negative impact on the environment. As AI advances in predicting weather, understanding human speech, enhancing banking payments, and revolutionising healthcare, the models involved must not only be trained on large datasets but also require massive computing power to improve their accuracy. Such heavy computing and processing consumes a tremendous amount of energy and emits carbon dioxide, which has become an environmental concern. According to one report, training a single large AI model can emit approximately 626,000 pounds (284 tonnes) of carbon dioxide, roughly five times the lifetime emissions of an average US car.
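The quoted figures can be sanity-checked with simple unit arithmetic. The 126,000 lb lifetime footprint for an average US car (fuel included) used below is the comparison figure commonly cited alongside this 626,000 lb estimate; treat it as an assumption here:

```python
LB_PER_KG = 2.20462  # pounds per kilogram

training_lb = 626_000        # quoted CO2 emissions for the training run
car_lifetime_lb = 126_000    # assumed average US car lifetime CO2, fuel included

# Convert pounds to metric tonnes: lb -> kg -> tonnes.
training_tonnes = training_lb / LB_PER_KG / 1000

# Ratio of the training run's emissions to one car's lifetime emissions.
ratio = training_lb / car_lifetime_lb
```

The conversion lands on about 284 tonnes and a ratio just under five, consistent with the article's "approximately five times" claim.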