Energy Policy



Knauf Insulation promises to refine EPC measurement with new machine-learning tool – IAM Network

#artificialintelligence

The new technology provides 'actual' thermal fabric performance data (Credit: Knauf Insulation). EXCLUSIVE: A system that uses machine learning to measure the 'actual' fabric thermal performance of a home within three months could provide the evidence base for a national retrofit programme. Knauf Insulation has developed a technology that uses machine learning to accurately measure the 'actual' energy performance of individual homes, an innovation that could drastically enhance the accuracy of energy performance certificates (EPCs), BusinessGreen can reveal. The new technology, which can generate an assessment of a home's fabric performance within three months, could provide the evidence base for an energy efficiency retrofit programme for the nation's homes, the company said. The "discreet, scalable and cost-efficient" measurement tool ensures the building fabric component of a building's EPC rating can be backed by real evidence, rather than "notional Standard Assessment Procedure calculations", according to Knauf Insulation. The company stressed that the tool marked a major departure from other available techniques for measuring 'actual' fabric thermal performance, which it said were "intrusive and expensive". Steven Heath, technical and strategy director of Knauf Insulation, celebrated the launch of the product, noting that the UK's 2050 net zero emissions ambition depended on the country's housing …
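The article does not describe how the tool works internally. As a hedged illustration only, one common approach to measuring 'actual' fabric performance is to regress a home's heating power against the indoor-outdoor temperature difference taken from smart meter and sensor data; the fitted slope approximates the whole-house heat transfer coefficient (HTC). The sketch below assumes that regression approach and invented data; it is not Knauf Insulation's method, and `estimate_htc` is a hypothetical helper.

```python
# Illustrative sketch only -- NOT Knauf Insulation's actual method.
# Regress heating power against the indoor/outdoor temperature difference;
# the slope approximates the heat transfer coefficient (HTC, in W/K).
import numpy as np

def estimate_htc(heating_power_w, indoor_temp_c, outdoor_temp_c):
    """Fit heating_power ~ HTC * (T_in - T_out) + baseline by least squares."""
    delta_t = np.asarray(indoor_temp_c) - np.asarray(outdoor_temp_c)
    X = np.column_stack([delta_t, np.ones_like(delta_t)])
    (htc, baseline), *_ = np.linalg.lstsq(X, np.asarray(heating_power_w), rcond=None)
    return htc, baseline

# Hypothetical daily averages over part of a heating season
power = [3200.0, 2800.0, 2100.0, 1500.0]   # W
t_in  = [20.0, 20.0, 19.5, 19.0]           # degrees C
t_out = [2.0, 5.0, 9.0, 12.0]              # degrees C

htc, _ = estimate_htc(power, t_in, t_out)
print(f"Estimated HTC: {htc:.0f} W/K")     # lower HTC = better fabric performance
```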


#artificialintelligence

Can artificial intelligence be deployed to slow down global warming, or is AI one of the greatest climate sinners ever? That is the debate that (not surprisingly) finds representatives of the AI industry and academia on opposite sides of the issue. While PwC and Microsoft published a report concluding that using AI could reduce worldwide greenhouse gas emissions by 4% in 2030, researchers from the University of Massachusetts Amherst have calculated that training a single AI model can emit more than 626,000 pounds of carbon dioxide equivalent--nearly five times the lifetime emissions of the average American car. The big players have clearly understood that the public's sensitivity to climate change offers a wonderful marketing opportunity. IBM has launched its Green Horizons project to analyze environmental data and predict pollution.
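The car comparison is easy to sanity-check. A minimal arithmetic sketch, assuming the figures reported by the UMass Amherst study (Strubell et al., 2019), which put average US car lifetime emissions, including manufacturing, at roughly 126,000 lb CO2-equivalent:

```python
# Sanity check of the "nearly five times a car" comparison; both
# figures are approximate values reported in the study cited above.
MODEL_TRAINING_LB_CO2E = 626_000   # NAS-style training pipeline, lb CO2e
CAR_LIFETIME_LB_CO2E = 126_000     # avg US car incl. manufacturing (assumed)

print(f"~{MODEL_TRAINING_LB_CO2E / CAR_LIFETIME_LB_CO2E:.1f}x")  # ~5.0x
```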


Energy-saving designs for data-intensive computer processing

AIHub

Researchers have demonstrated methods for both designing innovative data-centric computing hardware and co-designing hardware with machine-learning algorithms that together could improve energy efficiency by as much as two orders of magnitude. Advances in machine learning have ushered in a new era of computing -- the data-centric era -- and are forcing engineers to rethink aspects of computing architecture that have gone mostly unchallenged for 75 years. "The problem is that for large-scale deep neural networks, which are state-of-the-art for machine learning today, more than 90% of the electricity needed to run the entire system is consumed in moving data between the memory and processor," said Yingyan Lin, an assistant professor of electrical and computer engineering. Lin and collaborators proposed two complementary methods for optimizing data-centric processing, both of which were presented at the International Symposium on Computer Architecture (ISCA), a conference for new ideas and research in computer architecture. The drive for data-centric architecture is related to a problem called the von Neumann bottleneck, an inefficiency that stems from the separation of memory and processing in the computing architecture that has reigned supreme since mathematician John von Neumann developed it in 1945.
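To see why data movement dominates, compare per-operation energy costs. The sketch below uses order-of-magnitude figures in the spirit of Horowitz's widely cited ISSCC 2014 estimates for a 45nm process (assumed here; exact values vary by technology):

```python
# Why the von Neumann bottleneck matters energetically: fetching operands
# from off-chip DRAM costs far more than computing with them.
PJ_FP32_MULT = 3.7        # 32-bit floating-point multiply
PJ_FP32_ADD  = 0.9        # 32-bit floating-point add
PJ_DRAM_READ = 640.0      # reading one 32-bit word from off-chip DRAM

# One multiply-accumulate whose two operands both come from DRAM:
compute_pj = PJ_FP32_MULT + PJ_FP32_ADD
memory_pj  = 2 * PJ_DRAM_READ
print(f"data movement share: {memory_pj / (memory_pj + compute_pj):.1%}")  # ~99.6%
```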


Deep Learning's Climate Change Problem

#artificialintelligence

The human brain is an incredibly efficient source of intelligence. Earlier this month, OpenAI announced it had built the biggest AI model in history. This astonishingly large model, known as GPT-3, is an impressive technical achievement. Yet it highlights a troubling and harmful trend in the field of artificial intelligence--one that has not gotten enough mainstream attention. Modern AI models consume a massive amount of energy, and these energy requirements are growing at a breathtaking rate.
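One way to make "growing at a breathtaking rate" concrete: OpenAI's own "AI and Compute" analysis estimated that the compute used in the largest training runs doubled roughly every 3.4 months between 2012 and 2018. A one-line sketch of what that implies per year:

```python
# Compound growth implied by a 3.4-month doubling time (illustrative).
DOUBLING_MONTHS = 3.4
print(f"~{2 ** (12 / DOUBLING_MONTHS):.0f}x more training compute per year")  # ~12x
```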


How Having Bigger AI Models Can Have A Detrimental Impact On Environment

#artificialintelligence

The COVID crisis has accelerated the adoption of artificial intelligence -- from tackling the global pandemic to serving as a vital tool in managing various business processes. Despite its benefits, AI has long been scrutinised over ethical concerns such as bias and privacy. The technology also has significant sustainability issues: it is known to consume a massive amount of energy, creating a negative impact on the environment. As AI advances in predicting weather, understanding human speech, enhancing banking payments, and revolutionising healthcare, the models involved must not only be trained on large datasets but also require massive computing power to improve their accuracy. Such heavy computing and processing consumes a tremendous amount of energy and emits carbon dioxide, which has become an environmental concern. According to one report, the power required to train a large AI model emits approximately 626,000 pounds (284 tonnes) of carbon dioxide, roughly five times the lifetime emissions of the average US car.
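The arithmetic behind such estimates is simple: energy consumed multiplied by the carbon intensity of the grid supplying it. A minimal sketch, assuming a rough global-average grid intensity of about 0.475 kg CO2/kWh and a hypothetical 600 MWh training run (both values are illustrative assumptions, not figures from the article):

```python
# Energy-to-emissions conversion for a training run (all inputs assumed).
KG_CO2_PER_KWH = 0.475        # rough global-average grid carbon intensity
LB_PER_KG = 2.20462

def training_emissions_lb(energy_kwh: float) -> float:
    """CO2 (lb) emitted to supply energy_kwh of grid electricity."""
    return energy_kwh * KG_CO2_PER_KWH * LB_PER_KG

print(f"{training_emissions_lb(600_000):,.0f} lb CO2")  # ~628,000 lb
```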


Machine Learning Can Now Solve Climate Change - Here's How

#artificialintelligence

All of us use heating and cooling systems in our homes and offices on a day-to-day basis without knowing their impact. They account for almost half of average residential energy use. Nest has come up with a smart thermostat, the Nest Learning Thermostat, that can help us save money and the planet too. Smart thermostats automatically adjust indoor temperature settings based on data such as external humidity.
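As a toy illustration of the kind of logic a smart thermostat applies (not Nest's actual algorithm; the schedule, setpoints and hysteresis band below are all invented):

```python
# Toy smart-thermostat logic: a learned occupancy schedule picks the
# setpoint, and hysteresis control avoids rapid on/off cycling.
def target_temp_c(hour: int, home_schedule: list, outdoor_temp_c: float) -> float:
    """Pick a setpoint: lower it when nobody is home, nudge for mild weather."""
    base = 20.0 if home_schedule[hour] else 16.0   # learned occupancy (assumed)
    if outdoor_temp_c > 15.0:                      # mild outside: heat less
        base -= 1.0
    return base

def heating_on(indoor_c, setpoint_c, currently_on, band=0.5):
    """Only switch when indoor temp leaves the +/- band around the setpoint."""
    if indoor_c < setpoint_c - band:
        return True
    if indoor_c > setpoint_c + band:
        return False
    return currently_on

schedule = [7 <= h <= 22 for h in range(24)]  # hypothetical: home 07:00-22:00
print(target_temp_c(9, schedule, outdoor_temp_c=4.0))       # 20.0
print(heating_on(19.2, 20.0, currently_on=False))           # True
```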


Can Your AI Differentiate Cats from Covid-19? Sample Efficient Uncertainty Estimation for Deep Learning Safety

#artificialintelligence

Climate change impact studies are subject to numerous uncertainties and assumptions. One of the main sources of uncertainty arises from the interpretation of climate model projections. Probabilistic procedures based on multimodel ensembles have been suggested in the literature to quantify this source of uncertainty. However, the interpretation of multimodel ensembles remains challenging. Several assumptions are often required in the uncertainty quantification of climate model projections.
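The multimodel-ensemble idea mentioned in the abstract can be sketched in a few lines: treat each model's projection as one sample of an uncertain quantity and summarise the spread. A minimal sketch with invented numbers (not data from the paper):

```python
# Quantify projection uncertainty from a multimodel ensemble by treating
# each model's output as one sample and summarising the spread.
import numpy as np

# Hypothetical end-of-century warming projections (degrees C) from 8 models
projections = np.array([2.1, 2.8, 3.4, 2.5, 3.9, 3.0, 2.7, 3.3])

mean = projections.mean()
p05, p95 = np.percentile(projections, [5, 95])
print(f"Ensemble mean: {mean:.1f} C, 5-95% range: [{p05:.1f}, {p95:.1f}] C")
```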


Giant larvacean could help the battle against climate change

Daily Mail - Science & tech

A strange sea creature that lives 1,000 feet below the surface, encased in a giant bubble of mucus, may be key to removing carbon dioxide from the atmosphere. These bubble-houses are discarded and replaced regularly as the animal grows and its filters become clogged with particles. Once discarded, they sink to the seafloor and encapsulate the carbon for good, preventing it from re-entering the atmosphere. Larvaceans also capture and dispose of microplastics in this way, which can come from clothing and cosmetics and are often ingested by other marine species. Researchers used a system of lasers mounted on a 12,000-pound robot to map the giant larvacean's delicate body in a series of 3D images.