Artificial intelligence drives the way to net-zero emissions

#artificialintelligence

The fourth industrial revolution (Industry 4.0) is already happening, and it is transforming the way manufacturing operations are carried out. Industry 4.0 is a product of the digital era: automation and data exchange in manufacturing technologies are shifting the central industrial control system to a smart setup that bridges the physical and digital worlds, connected via the Internet of Things (IoT). Industry 4.0 is creating cyber-physical systems that can network a production process, enabling value creation and real-time optimisation. The main factor driving the revolution is the advance of artificial intelligence (AI) and machine learning. The complex algorithms involved in AI use the data collected from cyber-physical systems, resulting in "smart manufacturing".


The Environmental Impact of AI

#artificialintelligence

Climate change has been a problem for many years, and it affects our health, agriculture, housing, security and employment. Carbon dioxide (CO2) in the atmosphere comes from natural sources as well as from burning fossil fuels. These impacts are followed by solutions that researchers and developers can implement right away to transform the future. AI has also been the driving force behind a number of positive changes for the environment.


Chernobyl scientists want robots and drones to monitor radiation risk

New Scientist

Drones and robots could form part of a new radiation-monitoring system at the Chernobyl power station in Ukraine, as scientists at the plant fear that existing sensor networks built after the nuclear accident in 1986 have been at least partially destroyed by Russian troops. When Russia seized the Chernobyl plant in February, the sensors monitoring gamma radiation levels quickly went offline and most remained that way.


'India has one of the most sophisticated energy transmission systems'

#artificialintelligence

How smart are electrical grids in India? The grids in India are not resilient enough: there are interruptions, flickering and so on. A great deal of work must be done to improve the power quality of the grids, partly because the infrastructure has not kept pace with growing electricity demand. We also do not yet have much redundancy in the grid because of infrastructure constraints.


Rateless Codes for Near-Perfect Load Balancing in Distributed Matrix-Vector Multiplication

Communications of the ACM

Large-scale machine learning and data mining applications require computer systems to perform massive matrix-vector and matrix-matrix multiplication operations that need to be parallelized across multiple nodes. The presence of straggling nodes (computing nodes that unpredictably slow down or fail) is a major bottleneck in such distributed computations. Ideal load balancing strategies that dynamically allocate more tasks to faster nodes require knowledge or monitoring of node speeds as well as the ability to quickly move data. Recently proposed fixed-rate erasure coding strategies can handle unpredictable node slowdown, but they ignore partial work done by straggling nodes, thus resulting in a lot of redundant computation. We propose a rateless fountain coding strategy that achieves the best of both worlds: we prove that its latency is asymptotically equal to that of ideal load balancing, and that it performs asymptotically zero redundant computation. Our idea is to create linear combinations of the m rows of the matrix and assign these encoded rows to different worker nodes. The original matrix-vector product can be decoded as soon as slightly more than m row-vector products are collectively finished by the nodes. Evaluation on parallel and distributed computing systems yields as much as a three-times speedup over uncoded schemes.

Matrix-vector multiplications form the core of a plethora of scientific computing and machine learning applications, including solving partial differential equations, forward and back propagation in neural networks, and computing the PageRank of graphs. In the age of Big Data, most of these applications involve multiplying extremely large matrices and vectors, and the computations cannot be performed efficiently on a single machine. This has motivated the development of several algorithms that seek to speed up matrix-vector multiplication by distributing the computation across multiple computing nodes.
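
The encode-then-decode idea in the abstract can be illustrated in a few lines of Python with NumPy. The sketch below is only a toy: it uses dense Gaussian random combinations of the rows and least-squares decoding, whereas the paper's rateless fountain code uses sparse combinations and a fast peeling decoder; the worker simulation and all parameters are hypothetical.

```python
import numpy as np

def encode_rows(A, num_encoded, rng):
    """Build random linear combinations of A's rows (dense toy coding; the
    paper's fountain code uses sparse combinations for cheap decoding)."""
    m = A.shape[0]
    G = rng.standard_normal((num_encoded, m))  # encoding coefficients
    return G, G @ A                            # encoded rows, one per worker task

def simulate_workers(encoded_rows, x, rng, straggle_prob=0.2):
    """Each task computes one encoded-row dot x; stragglers simply never reply."""
    responded = rng.random(encoded_rows.shape[0]) > straggle_prob
    return responded, encoded_rows @ x

def decode(G, responded, products, m):
    """Recover A @ x once slightly more than m encoded products have arrived."""
    idx = np.flatnonzero(responded)
    if idx.size < m:
        raise RuntimeError("not enough responses to decode yet")
    # Solve G[idx] @ (A @ x) = products[idx] for the original product.
    Ax, *_ = np.linalg.lstsq(G[idx], products[idx], rcond=None)
    return Ax

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n = 50, 20
    A, x = rng.standard_normal((m, n)), rng.standard_normal(n)
    # Encode twice as many rows as strictly needed so stragglers can be ignored.
    G, encoded = encode_rows(A, num_encoded=2 * m, rng=rng)
    responded, products = simulate_workers(encoded, x, rng)
    print(np.allclose(decode(G, responded, products, m), A @ x))
```

The key property the toy preserves is that the decoder does not care which workers finished, only that slightly more than m encoded row-vector products have arrived.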


Papers to Read on using Long Short-Term Memory (LSTM) architecture in forecasting

#artificialintelligence

Abstract: The spread of COVID-19 has coincided with the rise of Graph Neural Networks (GNNs), leading to several studies proposing their use to better forecast the evolution of the pandemic. Many such models also include Long Short-Term Memory (LSTM) networks, a common tool for time series forecasting. In this work, we further investigate the integration of these two methods by implementing GNNs within the gates of an LSTM and exploiting spatial information. In addition, we introduce a skip connection which proves critical to jointly capture the spatial and temporal patterns in the data. We validate our daily COVID-19 new cases forecast model on data from 37 European nations over the last 472 days and show superior performance compared to state-of-the-art graph time series models based on mean absolute scaled error (MASE).
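
A minimal sketch of the core idea (graph convolutions inside the LSTM gates) is shown below in PyTorch. It is an assumption-laden toy: the adjacency matrix, gate structure, readout head, and the placement of the skip connection are my own illustrative choices, not the model from the paper.

```python
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """One-hop graph convolution: propagate node features over a normalized
    adjacency matrix, then apply a learned linear map."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # x: (num_nodes, in_dim), adj_norm: (num_nodes, num_nodes)
        return self.lin(adj_norm @ x)

class GConvLSTMCell(nn.Module):
    """LSTM cell whose input/forget/cell/output gates use graph convolutions
    instead of plain linear layers, so each nation's hidden state mixes in
    information from its neighbours at every time step."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.gc_x = GraphConv(in_dim, 4 * hidden_dim)
        self.gc_h = GraphConv(hidden_dim, 4 * hidden_dim)

    def forward(self, x, h, c, adj_norm):
        gates = self.gc_x(x, adj_norm) + self.gc_h(h, adj_norm)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c_next = f * c + i * torch.tanh(g)
        h_next = o * torch.tanh(c_next)
        return h_next, c_next

# Toy usage: 37 nations, 1 input feature (daily new cases), 16 hidden units.
num_nodes, in_dim, hidden_dim, T = 37, 1, 16, 14
adj = torch.rand(num_nodes, num_nodes).round()          # placeholder adjacency
adj_norm = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)
cell = GConvLSTMCell(in_dim, hidden_dim)
head = nn.Linear(hidden_dim + in_dim, 1)  # "skip connection": raw input reused at readout (my guess)
h = torch.zeros(num_nodes, hidden_dim)
c = torch.zeros(num_nodes, hidden_dim)
x_seq = torch.rand(T, num_nodes, in_dim)
for t in range(T):
    h, c = cell(x_seq[t], h, c, adj_norm)
forecast = head(torch.cat([h, x_seq[-1]], dim=-1))      # next-day cases per nation
print(forecast.shape)  # torch.Size([37, 1])
```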


Zapped: The grid is on life support. Can AI fix it?

ZDNet

America's electric system is long overdue for an overhaul. With a 2021 American Society of Civil Engineers report finding that 70% of transmission and distribution (T&D) lines are over 25 years old, it's no shock that large, sustained outages are occurring with increased frequency throughout the country. Last year, major outages in California and Texas were both triggered by extreme weather events that caused local power demand to exceed supply. With climate change fueling extreme weather events, plant and city managers are increasingly turning to AI technologies to predict energy consumption levels days in advance, mitigating the potential for power outage incidents and increasing overall power grid reliability. To understand the problems facing the current power grid, which by one conception constitutes the largest and most complicated machine in the world, I reached out to Steve Kwan, Director of Product Management at Beyond Limits, which develops industrial AI to drive growth in a variety of industries.
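
The kind of day-ahead demand forecasting mentioned above can be sketched with off-the-shelf tools. Below is a minimal, hypothetical example using synthetic hourly load data and scikit-learn's GradientBoostingRegressor with lag and calendar features; it illustrates the general technique, not the Beyond Limits system discussed in the article.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic hourly load with daily and weekday cycles, standing in for real meter data.
idx = pd.date_range("2023-01-01", periods=24 * 365, freq="h")
load = (100
        + 20 * np.sin(2 * np.pi * idx.hour / 24)
        + 10 * (idx.dayofweek < 5)
        + np.random.default_rng(0).normal(0, 3, len(idx)))
df = pd.DataFrame({"load": load}, index=idx)

# Calendar features plus the load 24h and 48h earlier (simple lag features).
df["hour"] = df.index.hour
df["dayofweek"] = df.index.dayofweek
df["lag_24h"] = df["load"].shift(24)
df["lag_48h"] = df["load"].shift(48)
df = df.dropna()

features = ["hour", "dayofweek", "lag_24h", "lag_48h"]
train, test = df.iloc[:-24 * 7], df.iloc[-24 * 7:]   # hold out the final week

model = GradientBoostingRegressor().fit(train[features], train["load"])
pred = model.predict(test[features])
mae = np.mean(np.abs(pred - test["load"].values))
print(f"day-ahead MAE over held-out week: {mae:.2f}")
```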


Endeavour Energy showcases 5G drones for electricity grid repair

ZDNet

Endeavour Energy, together with Optus, Amazon Web Services, and Unleash live, has deployed its first 5G and AI-enabled drones to improve restoration times for unplanned electricity outages, particularly during natural disasters such as storms, floods, and bushfires. As part of the first demonstration, Endeavour Energy flew the drones over physical electricity infrastructure located in Sydney's western suburb of St Marys. During the flyover, footage of damaged assets was streamed in real-time using 5G to Endeavour Energy's training ground in Hoxton Park. With the demonstration a success, according to Optus, Endeavour Energy will now deploy the solution across infrastructure assets in Penrith and Blacktown, which would remove the need to use a large fleet of vehicles, helicopters, and technicians to physically identify and carry out remediation. "We're thrilled to work with Optus, AWS, and Unleash live, with the support of the Australian government to expedite the use of 5G drone technology to make faster decisions and expedite critical maintenance to continue to keep the lights on for our customers," Endeavour Energy chief asset and operating officer Scott Ryan said.


Building Machine Learning Infrastructure at Netflix and beyond

#artificialintelligence

Savin Goyal is CTO and co-founder of Outerbounds, a startup building infrastructure to help teams streamline how they build machine learning applications. Prior to starting Outerbounds, Savin and his team worked at Netflix, where they were instrumental in the creation and release of Metaflow, an open source Python framework that addresses some of the challenges data scientists face around scalability and version control. The machine learning universe is moving really fast, so how can we make sure we're not making a bet that would hinder our progress two or four years further down the line? Deep learning is super popular, but tomorrow there could be a new way of doing machine learning.
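
To make the framework concrete, here is a minimal Metaflow flow; the step and artifact names are made up for illustration. Every run is versioned, and any value assigned to self is persisted as an artifact, which is how Metaflow addresses the scalability and version-control pain points mentioned above.

```python
from metaflow import FlowSpec, step

class TrainFlow(FlowSpec):
    """A minimal Metaflow flow: each run is versioned, and any value
    assigned to self is stored as an artifact that later runs,
    notebooks, or teammates can inspect."""

    @step
    def start(self):
        # Hypothetical placeholder for loading data.
        self.data = list(range(10))
        self.next(self.train)

    @step
    def train(self):
        # Hypothetical "model": just an average, to keep the sketch self-contained.
        self.model = sum(self.data) / len(self.data)
        self.next(self.end)

    @step
    def end(self):
        print(f"trained model artifact: {self.model}")

if __name__ == "__main__":
    TrainFlow()
```

Running `python train_flow.py run` executes the steps in order and records the run and its artifacts.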


The need of AI Risk Managers in organizations: AI is not a risk-free asset

#artificialintelligence

There are several risks involved in dealing with Artificial Intelligence (AI). In a globalized world, however, such methodological choices can eventually snowball into much greater economic risks. Let me make a case for an AI Risk Manager in organizations, preferably sitting in the Risk Management Department (RMD) if not in compliance. A series of recent AI mishaps has further ignited the debate. A person from Michigan sued the Detroit police after being falsely arrested and falsely identified as a shoplifting suspect by the department's facial recognition software.