Knauf Insulation EXCLUSIVE: The new technology provides 'actual' thermal fabric performance data. A system that uses machine learning to measure the 'actual' fabric thermal performance of a home within three months could provide the evidence base for a national retrofit programme. Knauf Insulation has developed a technology that uses machine learning to accurately measure the 'actual' energy performance of individual homes, an innovation that could drastically enhance the accuracy of energy performance certificates (EPCs), BusinessGreen can reveal. The new technology, which can generate an assessment of the fabric performance of a home within three months, could provide the evidence base for an energy efficiency retrofit programme for the nation's homes, the company said. The "discreet, scalable and cost-efficient" measurement tool ensures the building fabric component of a building's EPC rating can be backed by real evidence, rather than "notional Standard Assessment Procedure calculations", according to Knauf Insulation. The company stressed that the tool marks a major departure from other available techniques for measuring 'actual' fabric thermal performance, which it described as "intrusive and expensive". Steven Heath, technical and strategy director of Knauf Insulation, welcomed the launch of the product, noting that the UK's 2050 net zero emissions ambition depends on the country's housing …
Can artificial intelligence be deployed to slow down global warming, or is AI one of the greatest climate sinners ever? That is the interesting debate that finds (not surprisingly) representatives from the AI industry and academia on opposite sides of the issue. While PwC and Microsoft published a report concluding that using AI could reduce world-wide greenhouse gas emissions by 4% in 2030, researchers from the University of Massachusetts Amherst have calculated that training a single AI model can emit more than 626,000 pounds of carbon dioxide equivalent--nearly five times the lifetime emissions of the average American car. The big players have clearly understood that public sensitivity towards climate change offers a wonderful marketing opportunity. IBM has launched its Green Horizons project to analyze environmental data and predict pollution.
Polymeric membranes assist with a wide variety of tasks, including water filtration and gas-vapor separation. Designing a membrane for the desired function is more time-consuming than people may expect. However, researchers at Columbia Engineering, Germany's Max Planck Society and the University of South Carolina applied data science to the task to streamline their efforts. More specifically, they combined big data with machine learning to strategically design polymer membranes to act as gas filters. People frequently depend on plastic films and membranes to separate mixtures of simple gases, such as carbon dioxide and methane.
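The core idea of such data-driven screening is to fit a surrogate model on known polymers and use it to rank untested candidates. The sketch below illustrates this with a plain least-squares fit on entirely synthetic data; the descriptors, weights, and "selectivity" values are invented for illustration and do not come from the study:

```python
import numpy as np

# Hypothetical illustration of ML-guided membrane screening. All descriptor
# names and numbers here are invented, not taken from the published work.
rng = np.random.default_rng(0)

n_polymers = 200
descriptors = rng.normal(size=(n_polymers, 3))   # 3 made-up polymer descriptors
true_weights = np.array([1.5, -0.8, 0.3])
# Synthetic "CO2/CH4 selectivity" labels with a little measurement noise
selectivity = descriptors @ true_weights + 0.05 * rng.normal(size=n_polymers)

# Least-squares fit: the simplest possible machine-learned surrogate model
X = np.column_stack([descriptors, np.ones(n_polymers)])  # add intercept column
weights, *_ = np.linalg.lstsq(X, selectivity, rcond=None)

# Screen new candidates by predicted selectivity instead of testing each in the lab
candidates = rng.normal(size=(1000, 3))
predicted = np.column_stack([candidates, np.ones(1000)]) @ weights
best = candidates[np.argmax(predicted)]
```

Real efforts use far richer featurizations and nonlinear models, but the workflow -- fit on measured polymers, rank the untested ones -- is the same.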
Managing electrical energy consumption is crucial, simply because of one fact: electricity cannot be stored, unless converted to other forms. It is best for produced electricity to be instantly consumed; otherwise, additional resources and costs are incurred to convert and store the excess energy. Energy-efficient buildings provide both economic and environmental benefits, maximising profits and social welfare. Conversely, underestimating energy consumption could be fatal, with excess demand overloading the supply line and even causing blackouts, leading to operational downtime. Clearly, there are tangible benefits in closely monitoring the energy consumption of buildings -- be they office, commercial or household. With the advent of machine learning and data science, accurately predicting future energy consumption becomes increasingly possible. This provides two-fold benefits: firstly, managers gain key insights into the factors affecting their building's energy demand, providing opportunities to address them and improve energy efficiency. Secondly, forecasts provide a benchmark to single out anomalously high or low energy consumption and alert managers to faults within the building.
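The forecast-as-benchmark idea can be sketched very simply: predict each day's consumption from a trailing average, then flag days that deviate from the prediction by more than a tolerance. This is a minimal illustration, not any specific product, and the kWh figures are invented:

```python
import statistics

# Invented daily consumption readings for one building (kWh); day 7 is a fault
daily_kwh = [320, 335, 310, 325, 330, 340, 315, 520, 318, 325]

def forecast_and_flag(series, window=5, tolerance=0.25):
    """Forecast each day as the mean of the previous `window` days, and flag
    an anomaly when |actual - forecast| exceeds tolerance * forecast."""
    forecasts, anomalies = [], []
    for i in range(window, len(series)):
        f = statistics.mean(series[i - window:i])
        forecasts.append(f)
        if abs(series[i] - f) > tolerance * f:
            anomalies.append(i)
    return forecasts, anomalies

forecasts, anomalies = forecast_and_flag(daily_kwh)  # flags the 520 kWh spike
```

Production systems replace the trailing average with weather- and occupancy-aware models, but the alerting logic -- compare actual use against a forecast benchmark -- works the same way.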
Artificial intelligence has a terrible carbon footprint. Researchers at Stanford University, Facebook AI Research, and Canada's McGill University have developed a tool to measure the hidden cost of machine learning. The "experiment impact tracker" quantifies how much electricity a machine learning project will consume, and its cost in carbon emissions. The team first measured the energy cost of a specific artificial intelligence (AI) model--a challenge because a single machine often trains several models concurrently, while each session also draws power for shared overhead functions like data storage and cooling. The researchers then translated power consumption into carbon emissions by tapping into public sources about the local energy mix, whose blend of renewable and fossil fuels varies by location and time of day.
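The translation step is, at its core, a multiplication of energy consumed by the carbon intensity of the grid that supplied it. The back-of-envelope sketch below shows why location matters so much; the intensity figures are rough public estimates (in gCO2-equivalent per kWh), not values from the tracker itself:

```python
# Approximate grid carbon intensities, grams CO2-equivalent per kWh.
# These are illustrative ballpark figures, not data from the paper.
CARBON_INTENSITY = {
    "coal_heavy_grid": 800,
    "us_average": 400,
    "hydro_heavy_grid": 30,
}

def training_emissions_kg(energy_kwh, grid):
    """Convert a training run's electricity use into kg of CO2-equivalent."""
    return energy_kwh * CARBON_INTENSITY[grid] / 1000.0

# The same 1,000 kWh training run looks very different depending on location:
same_run = {g: training_emissions_kg(1000, g) for g in CARBON_INTENSITY}
```

A run on a hydro-heavy grid here comes out more than an order of magnitude cleaner than the same run on a coal-heavy one, which is why the tracker must know where and when a job ran, not just how much power it drew.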
Researchers at the University of Pittsburgh, University of Massachusetts Amherst, and Microsoft Research India have developed a system -- WattScale -- that leverages AI to pick out the least energy-efficient buildings from a city- or region-level population. In a preprint study, they used it to show that half of the buildings in a 10,000-building data set were inefficient, in large part due to poor construction. Buildings also emit over a third of the nation's greenhouse gases, which is more than any other sector of the economy. Addressing the disparity requires identifying the buildings that are least efficient and thus have the greatest need for improvements, but approaches that rely on the age of a building or its total energy bill don't work well; greater energy usage doesn't necessarily point to inefficiencies. WattScale aims to address this with (1) a Bayesian modeling technique that captures the variable distributions governing the energy usage of a building and (2) a fault analysis algorithm that makes use of these distributions to report probable causes of inefficiency.
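The population-comparison idea behind such systems can be illustrated with a much simpler stand-in: fit a distribution to normalized usage across the building stock, then flag buildings that are highly improbable under that distribution. This sketch is not the WattScale model -- it replaces the Bayesian machinery with a normal fit -- and the usage figures are invented:

```python
import statistics
from math import erf, sqrt

# Invented annual energy use per square foot for ten buildings; one outlier
usage_per_sqft = [4.1, 3.8, 4.4, 3.9, 4.0, 4.2, 9.5, 4.3, 3.7, 4.5]

mu = statistics.mean(usage_per_sqft)
sigma = statistics.stdev(usage_per_sqft)

def upper_tail_prob(x):
    """P(a population value >= x) under a normal fit -- a crude stand-in for
    the probabilistic comparison a Bayesian model would perform."""
    z = (x - mu) / sigma
    return 0.5 * (1 - erf(z / sqrt(2)))

# Flag buildings whose usage is very unlikely under the population model
inefficient = [i for i, x in enumerate(usage_per_sqft)
               if upper_tail_prob(x) < 0.05]
```

Normalizing by floor area (rather than ranking raw bills) is what lets this comparison avoid the trap the excerpt mentions: a big building with a big bill is not necessarily inefficient.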
Researchers have demonstrated methods for both designing innovative data-centric computing hardware and co-designing hardware with machine-learning algorithms that together could improve energy efficiency by as much as two orders of magnitude. Advances in machine learning have ushered in a new era of computing -- the data-centric era -- and are forcing engineers to rethink aspects of computing architecture that have gone mostly unchallenged for 75 years. "The problem is that for large-scale deep neural networks, which are state-of-the-art for machine learning today, more than 90% of the electricity needed to run the entire system is consumed in moving data between the memory and processor," said Yingyan Lin, an assistant professor of electrical and computer engineering. Lin and collaborators proposed two complementary methods for optimizing data-centric processing, both of which were presented at the International Symposium on Computer Architecture (ISCA), a conference for new ideas and research in computer architecture. The drive for data-centric architecture is related to a problem called the von Neumann bottleneck, an inefficiency that stems from the separation of memory and processing in the computing architecture that has reigned supreme since mathematician John von Neumann developed it in 1945.
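The 90% figure becomes intuitive with a back-of-envelope comparison of per-operation energies. The numbers below are rough, widely cited 45 nm estimates (on the order of a few picojoules for a 32-bit floating-point multiply versus hundreds of picojoules for a 32-bit off-chip DRAM read); they are illustrative of the gap, not figures from Lin's papers:

```python
# Rough per-operation energy estimates (picojoules), illustrative only
PJ_PER_FLOP = 3.7         # 32-bit floating-point multiply
PJ_PER_DRAM_READ = 640.0  # 32-bit read from off-chip DRAM

def energy_breakdown(n_ops, dram_reads_per_op):
    """Return (compute_pJ, memory_pJ, memory_share) for a simple workload."""
    compute = n_ops * PJ_PER_FLOP
    memory = n_ops * dram_reads_per_op * PJ_PER_DRAM_READ
    return compute, memory, memory / (compute + memory)

# Even one DRAM read per arithmetic op puts ~99% of energy into data movement
compute_pj, memory_pj, share = energy_breakdown(n_ops=1_000_000,
                                                dram_reads_per_op=1.0)
```

This is why data-centric designs focus on keeping operands near the compute units: shaving arithmetic energy barely moves the total when each operand trip to DRAM costs two orders of magnitude more than the multiply it feeds.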
The human brain is an incredibly efficient source of intelligence. Earlier this month, OpenAI announced it had built the biggest AI model in history. This astonishingly large model, known as GPT-3, is an impressive technical achievement. Yet it highlights a troubling and harmful trend in the field of artificial intelligence--one that has not gotten enough mainstream attention. Modern AI models consume a massive amount of energy, and these energy requirements are growing at a breathtaking rate.
For all the advances enabled by artificial intelligence, from speech recognition to self-driving cars, AI systems consume a lot of power and can generate high volumes of climate-changing carbon emissions. A study last year found that training an off-the-shelf AI language-processing system produced 1,400 pounds of emissions--about the amount produced by flying one person roundtrip between New York and San Francisco. The full suite of experiments needed to build and train that AI language system from scratch can generate even more: up to 78,000 pounds, depending on the source of power. But there are ways to make machine learning cleaner and greener, a movement that has been called "Green AI." Some algorithms are less power-hungry than others, for example, and many training sessions can be moved to remote locations that get most of their power from renewable sources.
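The "move training to greener locations" lever amounts to choosing the region with the lowest grid carbon intensity at scheduling time. A minimal sketch, with placeholder region names and illustrative intensity values (gCO2/kWh) rather than live grid data:

```python
# Illustrative grid carbon intensities by region, grams CO2-equivalent per kWh.
# Region names and values are placeholders, not real measurements.
region_intensity = {
    "region_coal": 750,
    "region_mixed": 380,
    "region_hydro": 25,
}

def greenest_region(intensities):
    """Return the region whose grid emits the least CO2 per kWh."""
    return min(intensities, key=intensities.get)

choice = greenest_region(region_intensity)
```

Real carbon-aware schedulers also account for the time of day, since a grid's renewable share fluctuates, but region choice alone can already cut a training run's footprint substantially.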
The COVID crisis has caused applications of artificial intelligence to skyrocket -- from tackling the global pandemic to serving as a vital tool in managing various business processes. Despite its benefits, AI has long been scrutinised for ethical concerns like existing biases and privacy issues. However, the technology also has significant sustainability issues: it is known to consume a massive amount of energy, creating a negative impact on the environment. As AI gets more advanced at predicting weather, understanding human speech, enhancing banking payments, and revolutionising healthcare, the advanced models not only need to be trained on large datasets, but also require massive computing power to improve their accuracy. Such heavy computing and processing consumes a tremendous amount of energy and emits carbon dioxide, which has become an environmental concern. According to one report, the power required for training an AI model has been estimated to emit approximately 626,000 pounds (284 tonnes) of carbon dioxide -- roughly five times the lifetime emissions of the average US car.