Power Industry


Managing disaster and disruption with AI, one tree at a time

ZDNet

It sounds like a contradiction in terms, but disaster and disruption management is a thing. Disaster and disruption are precisely what ensues when catastrophic natural events occur, and unfortunately, the trajectory the world is on seems to be exacerbating the issue. In 2021 alone, the US experienced 15 weather/climate disaster events with damages exceeding $1 billion. Previously, we have explored various aspects of the ways data science and machine learning intertwine with natural events -- from weather prediction to the impact of climate change on extreme phenomena and measuring the impact of disaster relief. AiDash, however, is aiming at something different: helping utility and energy companies, as well as governments and cities, manage the impact of natural disasters, including storms and wildfires.


Hitachi Energy's new AI solution analyzes trees to prevent wildfires

#artificialintelligence

The massive, beautiful tree canopies in the Western U.S. can grow perilously close to power lines and quickly spark destructive wildfires. In fact, 70% of electrical outages are caused by vegetation, and vegetation-related outages increased 19% year over year from 2009 to 2020. The second-largest wildfire in California's history, the Dixie Fire, sparked when power lines came into contact with a fir tree. Could AI-driven solutions help prevent wildfires before they start by analyzing the tree growth that can spark them? Hitachi Energy, the Zurich, Switzerland-based global technology company, says yes. Hitachi Energy, formerly known as Hitachi ABB Power Grids (the name was changed last October), is currently focused on "powering good for a sustainable energy future."


An AI power play: Fueling the next wave of innovation in the energy sector

#artificialintelligence

Tatum, Texas might not seem like the most obvious place for a revolution in artificial intelligence (AI), but in October of 2020, that's exactly what happened. That was when Wayne Brown, the operations manager at the Vistra-owned Martin Lake Power Plant, built and deployed a heat rate optimizer (HRO). Vistra Corp. is the largest competitive power producer in the United States and operates power plants in 12 states with a capacity of more than 39,000 megawatts of electricity--enough to power nearly 20 million homes. Vistra has committed to reducing emissions by 60 percent by 2030 (against a 2010 baseline) and achieving net-zero emissions by 2050. To achieve its goals, the business is increasing efficiency in all its power plants and transforming its generation fleet by retiring coal plants and investing in solar and battery energy storage, including the world's largest grid-scale battery energy-storage facility.


Artificial intelligence drives the way to net-zero emissions

#artificialintelligence

The fourth industrial revolution (Industry 4.0) is already happening, and it's transforming the way manufacturing operations are carried out. Industry 4.0 is a product of the digital era: automation and data exchange in manufacturing technologies are shifting the central industrial control system to a smart setup that bridges the physical and digital worlds via the Internet of Things (IoT). Industry 4.0 is creating cyber-physical systems that can network a production process, enabling value creation and real-time optimisation. The main factor driving the revolution is the advances in artificial intelligence (AI) and machine learning. The complex algorithms involved in AI use the data collected from cyber-physical systems, resulting in "smart manufacturing".


The Environmental Impact of AI

#artificialintelligence

Climate change has been a problem for many years. It influences our health, agriculture, housing, security and employment. Carbon dioxide (CO2) accumulates in the atmosphere from both natural sources and the burning of fossil fuels. These problems come with solutions that researchers and developers can begin implementing now to transform the future, and AI has been the driving force behind numerous sound transformations for the environment.


Chernobyl scientists want robots and drones to monitor radiation risk

New Scientist

Drones and robots could form part of a new radiation-monitoring system at the Chernobyl power station in Ukraine, as scientists at the plant fear that existing sensor networks built after the nuclear accident in 1986 have been at least partially destroyed by Russian troops. When Russia seized the Chernobyl plant in February, the sensors monitoring gamma radiation levels quickly went offline and most remained that way.


'India has one of the most sophisticated energy transmission systems'

#artificialintelligence

How smart are electrical grids in India? The grids in India don't have enough resiliency: there are interruptions, flickering and so on. Quite a lot of work must be done to improve the power quality of the grids, partly because they have not kept pace with growing electricity demand. We also do not yet have much redundancy in the grid because of infrastructure constraints.


Rateless Codes for Near-Perfect Load Balancing in Distributed Matrix-Vector Multiplication

Communications of the ACM

Large-scale machine learning and data mining applications require computer systems to perform massive matrix-vector and matrix-matrix multiplication operations that need to be parallelized across multiple nodes. The presence of straggling nodes--computing nodes that unpredictably slow down or fail--is a major bottleneck in such distributed computations. Ideal load balancing strategies that dynamically allocate more tasks to faster nodes require knowledge or monitoring of node speeds as well as the ability to quickly move data. Recently proposed fixed-rate erasure coding strategies can handle unpredictable node slowdown, but they ignore partial work done by straggling nodes, thus resulting in a lot of redundant computation. We propose a rateless fountain coding strategy that achieves the best of both worlds--we prove that its latency is asymptotically equal to ideal load balancing, and it performs asymptotically zero redundant computations. Our idea is to create linear combinations of the m rows of the matrix and assign these encoded rows to different worker nodes. The original matrix-vector product can be decoded as soon as slightly more than m row-vector products are collectively finished by the nodes. Evaluation on parallel and distributed computing yields as much as three times speedup over uncoded schemes.

Matrix-vector multiplications form the core of a plethora of scientific computing and machine learning applications that include solving partial differential equations, forward and back propagation in neural networks, computing the PageRank of graphs, etc. In the age of Big Data, most of these applications involve multiplying extremely large matrices and vectors, and the computations cannot be performed efficiently on a single machine. This has motivated the development of several algorithms that seek to speed up matrix-vector multiplication by distributing the computation across multiple computing nodes.
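The core idea above — encode the rows of the matrix as linear combinations, hand coded row-vector products to workers, and decode once slightly more than m products have finished — can be illustrated with a toy numpy sketch. Note this simplification uses dense Gaussian combinations and a least-squares decode, not the paper's sparse LT-code construction with a peeling decoder; all names and the 10% overhead figure here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 50
A = rng.standard_normal((m, n))   # the large matrix, m rows
x = rng.standard_normal(n)        # the vector to multiply by

# Encode: each coded row is a random linear combination of the m rows of A.
# (Dense Gaussian combinations stand in for the paper's sparse fountain code.)
num_coded = int(m * 1.1)                   # generate slightly more than m tasks
G = rng.standard_normal((num_coded, m))    # encoding matrix
coded_rows = G @ A                         # these rows are sent to worker nodes

# Workers each compute one coded row-vector product; stragglers just finish fewer.
coded_products = coded_rows @ x

# Decode: any ~m finished products suffice to recover b = A x.
k = m + 2                                  # pretend only these finished in time
b_hat = np.linalg.lstsq(G[:k], coded_products[:k], rcond=None)[0]

assert np.allclose(b_hat, A @ x)           # recovered without waiting for all workers
```

Because the code is rateless, no worker's partial progress is wasted: the decoder simply uses whichever coded products arrive first, which is what eliminates the redundant computation of fixed-rate schemes.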


Papers to Read on using Long Short Term Memory(LSTM) architecture in forecasting

#artificialintelligence

Abstract: The spread of COVID-19 has coincided with the rise of Graph Neural Networks (GNNs), leading to several studies proposing their use to better forecast the evolution of the pandemic. Many such models also include Long Short-Term Memory (LSTM) networks, a common tool for time series forecasting. In this work, we further investigate the integration of these two methods by implementing GNNs within the gates of an LSTM and exploiting spatial information. In addition, we introduce a skip connection which proves critical to jointly capture the spatial and temporal patterns in the data. We validate our daily COVID-19 new cases forecast model on data of 37 European nations for the last 472 days and show superior performance compared to state-of-the-art graph time series models based on mean absolute scaled error (MASE).
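The architecture the abstract describes — a GNN inside the LSTM gates plus a skip connection — can be sketched in plain numpy. This is a loose, hypothetical simplification, not the paper's model: here the "GNN" is just one-hop neighbourhood averaging applied to the gate pre-activations, and the skip connection naively adds the input to the output (assuming matching dimensions).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def graph_lstm_step(X, H, C, A_hat, Wf, Wi, Wo, Wg):
    """One LSTM step where gate pre-activations pass through a one-hop
    graph aggregation (A_hat @ ...), so each node's gates also see its
    neighbours' signals. A skip connection from the input X is added to
    the output. Hypothetical simplification of the GNN-in-LSTM idea."""
    Z = A_hat @ np.concatenate([X, H], axis=1)   # GNN-style neighbourhood mixing
    f, i, o = sigmoid(Z @ Wf), sigmoid(Z @ Wi), sigmoid(Z @ Wo)
    g = np.tanh(Z @ Wg)
    C_new = f * C + i * g                        # standard LSTM cell update
    H_new = o * np.tanh(C_new) + X               # skip connection (dims assumed equal)
    return H_new, C_new

# Toy run: 4 regions on a chain graph, 3 features per region.
rng = np.random.default_rng(1)
n, d = 4, 3
A = np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
A_hat = A / A.sum(axis=1, keepdims=True)         # row-normalised adjacency
X = rng.standard_normal((n, d))
H, C = np.zeros((n, d)), np.zeros((n, d))
Wf, Wi, Wo, Wg = (rng.standard_normal((2 * d, d)) * 0.1 for _ in range(4))
H, C = graph_lstm_step(X, H, C, A_hat, Wf, Wi, Wo, Wg)
```

The point of mixing via A_hat inside the gates is that each region's forget/input/output decisions can depend on its neighbours' case counts, while the skip connection keeps a direct path from the current input to the prediction.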


Zapped: The grid is on life support. Can AI fix it?

ZDNet

America's electric system is long overdue for an overhaul. With a 2021 American Society of Civil Engineers report finding that 70% of T&D lines are over 25 years old, it's no shock that large, sustained outages are occurring with increased frequency throughout the country. Last year, major outages in California and Texas were both triggered by extreme weather events, causing local power demand to exceed supply. With climate change fueling extreme weather events, plant and city managers are increasingly turning to AI technologies to predict energy consumption levels days in advance, mitigating the risk of power outages and increasing overall power grid reliability. To understand the problems facing the current power grid, which by one conception constitutes the largest and most complicated machine in the world, I reached out to Steve Kwan, Director of Product Management at Beyond Limits, which develops industrial AI for a variety of industries.