Power Industry

Productionising AI knowledge management


Using AI for knowledge management is a great way to industrialise years of innovation on a company-wide level, writes Dr Warrick Cooke, Consultant at Tessella. An engineer who has worked in the same place – a factory, oil rig, nuclear power plant – for 20 years will be an expert in that facility. Their been-there-done-that experience means they can quickly make good decisions on the best response to a wide range of scenarios. That knowledge would be hugely valuable to others. It is also knowledge that will be lost when they move on.

Deep learning for load balancing of SDN‐based data center networks


With the development of new communication technologies, the volume of transmitted data has increased steadily. To meet this growing demand for computing resources, the number of data center networks (DCNs), structures composed of servers connected by well-organized switches, has grown worldwide. However, traditional switches do not efficiently satisfy the needs of DCNs. In recent years, an emerging networking architecture, the software-defined network (SDN), has been proposed to manage DCNs, control network switches, and deploy new network protocols. A main challenge in DCNs, however, remains balancing the load among servers.
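The abstract does not detail the deep-learning controller itself, but the underlying decision it replaces can be sketched. Below is a minimal, hypothetical illustration of the load-balancing choice an SDN controller makes for a DCN; a learned policy (as in the paper) would stand in for the simple least-loaded rule, and all server names and loads are invented:

```python
# Minimal sketch of an SDN controller's load-balancing decision in a DCN.
# Server names and loads are hypothetical; a deep-learning policy (as in
# the paper) would replace the simple least-loaded rule below.

def pick_server(loads):
    """Return the currently least-loaded server (baseline policy)."""
    return min(loads, key=loads.get)

def route_flows(flows, loads):
    """Assign each incoming flow to a server and update that server's load."""
    assignment = {}
    for flow_id, demand in flows:
        server = pick_server(loads)
        assignment[flow_id] = server
        loads[server] += demand
    return assignment

loads = {"s1": 10, "s2": 4, "s3": 7}
flows = [("f1", 3), ("f2", 5), ("f3", 2)]
print(route_flows(flows, loads))
```

The point of the sketch is only the interface: the controller sees global load state and assigns flows, which is what makes a learned, traffic-aware policy possible in the first place.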

Optimising processes with artificial intelligence


MECOMS is an IT solution provider specialising in the utility sector and focused on Microsoft and Azure technologies. We're investigating how artificial intelligence can help the different players in the utility market run their business processes more efficiently, using techniques such as machine learning and advanced chatbots. For distribution grid operators, we see interesting opportunities in machine learning. We came up with the idea of using machine learning to help run the process of validating meter readings that arrive as raw data in the IT system. Machine learning could be used to observe whether there is a consistent pattern in the reasons why meter readings fail validation.
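As a hedged sketch of the idea (not MECOMS's implementation), a first step toward learned validation is flagging readings whose implied consumption deviates strongly from a meter's historical pattern; all values here are hypothetical:

```python
# Hedged sketch (not MECOMS's implementation): flag a raw meter reading
# as a likely validation error when the implied consumption deviates
# strongly from the meter's historical pattern.

from statistics import mean, stdev

def is_suspect(history, new_consumption, k=3.0):
    """Flag consumption more than k standard deviations from the mean."""
    mu, sigma = mean(history), stdev(history)
    return abs(new_consumption - mu) > k * sigma

history = [12.1, 11.8, 12.4, 12.0, 11.9]  # hypothetical daily kWh values
print(is_suspect(history, 12.2))   # consistent with the pattern
print(is_suspect(history, 45.0))   # likely a misread or unit error
```

A trained model would go further than this statistical baseline by also predicting *why* a reading fails validation, which is the pattern-in-the-reasons idea described above.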

How Does a Smart Grid Improve Efficiency Using Artificial Intelligence?


Artificial intelligence is one of the most capable and promising technologies of all time.

How artificial intelligence can tackle climate change


Climate change is the biggest challenge facing the planet. It will need every solution possible, including technology like artificial intelligence (AI). Seeing a chance to help the cause, some of the biggest names in AI and machine learning--a discipline within the field--recently published a paper called "Tackling Climate Change with Machine Learning." The paper, which was discussed at a workshop during a major AI conference in June, was a "call to arms" to bring researchers together, said David Rolnick, a University of Pennsylvania postdoctoral fellow and one of the authors. "It's surprising how many problems machine learning can meaningfully contribute to," says Rolnick, who also helped organize the June workshop. The paper offers up 13 areas where machine learning can be deployed, including energy production, CO2 removal, education, solar geoengineering, and finance.

Congratulations to the #AAAI2021 best paper winners


The AAAI-21 best paper awards were announced on Thursday 4th February during the opening ceremony of AAAI 2021. There were three best papers, three best paper runners-up, and six distinguished papers. One of the winning papers addresses long sequence time-series forecasting. Many real-world applications require the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) demands a high prediction capacity of the model, which is the ability to efficiently capture precise long-range dependency coupling between output and input. Recent studies have shown the potential of the Transformer to increase this prediction capacity.

Fukushima College robot wins top prize for nuclear decommissioning

The Japan Times

Fukushima – A robot created by a team from a technology college in northeastern Japan recently won the top prize in a robotics competition themed around decommissioning the Fukushima No. 1 nuclear power plant. The Mehikari robot of Fukushima College earned praise for its speed as well as its ability to employ different methods to retrieve mock debris similar in size to that at the plant, the site of a nuclear disaster triggered by a massive earthquake and tsunami on March 11, 2011. The robot completed the set task in about 2 minutes, the fastest time, in the annual competition aimed at fostering future engineers, which was attended by students from 13 colleges belonging to the National Institute of Technology. Sunday's competition was the fifth of its kind. Students in 14 teams from colleges across the country, including in Osaka and Kumamoto prefectures, were tasked this year with developing robots to remove fuel debris from the plant, organizers said.

Spatial Network Decomposition for Fast and Scalable AC-OPF Learning Artificial Intelligence

This paper proposes a novel machine-learning approach for predicting AC-OPF solutions that features fast and scalable training. It is motivated by two critical considerations: (1) the fact that topology optimization and the stochasticity induced by renewable energy sources may lead to fundamentally different AC-OPF instances; and (2) the significant training time needed by existing machine-learning approaches for predicting AC-OPF. The proposed approach is a 2-stage methodology that exploits a spatial decomposition of the power network, which is viewed as a set of regions. The first stage learns to predict the flows and voltages on the buses and lines coupling the regions, and the second stage trains, in parallel, the machine-learning models for each region. Experimental results on the French transmission system (up to 6,700 buses and 9,000 lines) demonstrate the potential of the approach. Within a short training time, the approach predicts AC-OPF solutions with very high fidelity and minor constraint violations, producing significant improvements over the state of the art. The results also show that the predictions can seed a load flow optimization to return a feasible solution within 0.03% of the AC-OPF objective, while reducing running times significantly.
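The 2-stage structure described above can be sketched in code. This is an illustration of the training pipeline's shape only, with placeholder "models" (the paper's models are neural networks trained on AC-OPF instances, and all names and values here are hypothetical):

```python
# Sketch of the paper's 2-stage structure (illustrative only): stage 1
# predicts the coupling quantities between regions; stage 2 then trains
# one model per region, independently and in parallel.

from concurrent.futures import ThreadPoolExecutor

def train_coupling_model(training_data):
    """Stage 1: learn flows/voltages on the lines coupling the regions.
    Placeholder: return the mean coupling value per tie-line."""
    samples = {}
    for sample in training_data:
        for line, value in sample["tie_flows"].items():
            samples.setdefault(line, []).append(value)
    return {line: sum(v) / len(v) for line, v in samples.items()}

def train_region_model(region, training_data):
    """Stage 2: one model per region, trainable independently (placeholder)."""
    return f"model[{region}]"

data = [{"tie_flows": {"l12": 0.8}}, {"tie_flows": {"l12": 1.2}}]
coupling = train_coupling_model(data)           # stage 1 (sequential)
regions = ["R1", "R2"]
with ThreadPoolExecutor() as pool:              # stage 2 (parallel per region)
    region_models = dict(zip(regions, pool.map(
        lambda r: train_region_model(r, data), regions)))
print(coupling, region_models)
```

Fixing the coupling quantities first is what decouples the regions: once the tie-line values are known, each regional model sees a self-contained sub-problem, which is why stage 2 parallelizes.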

Privacy Protection of Grid Users Data with Blockchain and Adversarial Machine Learning Artificial Intelligence

Utilities around the world are reported to invest a total of around 30 billion over the next few years for installation of more than 300 million smart meters, replacing traditional analog meters [1]. By mid-decade, with full country-wide deployment, there will be almost 1.3 billion smart meters in place [1]. The collection of fine-grained energy usage data by these smart meters provides numerous advantages, such as energy savings for customers through demand optimization, more accurate billing with dynamic pricing programs, bidirectional information exchange between end-users for better consumer-operator interaction, and so on. However, all these perks associated with fine-grained energy usage data collection threaten the privacy of users. With this technology, customers' personal data such as sleeping cycles, number of occupants, and even the type and number of appliances stream into the hands of the utility companies and can be subject to misuse. This research paper addresses privacy violation of consumers' energy usage data collected from smart meters and provides a novel solution for privacy protection while preserving the benefits of energy data analytics. First, we demonstrate the successful application of occupancy detection attacks using a deep neural network method that yields high-accuracy results. We then introduce the Adversarial Machine Learning Occupancy Detection Avoidance with Blockchain (AMLODA-B) framework as a counter-attack, deploying an algorithm based on the Long Short-Term Memory (LSTM) model into the standardized smart metering infrastructure to prevent leakage of consumers' personal information. Our privacy-aware approach protects consumers' privacy without compromising the correctness of billing and preserves operational efficiency without use of authoritative intermediaries.
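To see why fine-grained consumption data leaks occupancy at all, consider a greatly simplified illustration of the attack idea (the paper's attack uses a deep neural network; the threshold, values, and names below are hypothetical):

```python
# Greatly simplified illustration of an occupancy detection attack
# (the paper uses a deep neural network): consumption above a standby
# baseline suggests active appliances, hence occupancy. Values and the
# threshold are hypothetical.

def detect_occupancy(readings_kw, baseline_kw=0.3):
    """Return a per-interval occupancy guess from smart-meter samples."""
    return [r > baseline_kw for r in readings_kw]

half_hourly = [0.2, 0.25, 1.4, 1.6, 0.22]  # fridge-only vs. cooking load
print(detect_occupancy(half_hourly))
```

Even this crude rule recovers a plausible presence pattern, which is why the paper's counter-measure perturbs the reported signal while keeping aggregate billing correct.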

A Tensor-Based Formulation of Hetero-functional Graph Theory Artificial Intelligence

Recently, hetero-functional graph theory (HFGT) has developed as a means to mathematically model the structure of large flexible engineering systems. In that regard, it intellectually resembles a fusion of network science and model-based systems engineering (MBSE). With respect to the former, it relies on multiple graphs as data structures so as to support matrix-based quantitative analysis. At the same time, HFGT explicitly embodies the heterogeneity of conceptual and ontological constructs found in model-based systems engineering, including system form, system function, and system concept. At their foundation, these disparate conceptual constructs suggest multi-dimensional rather than two-dimensional relationships. This paper provides the first tensor-based treatment of some of the most important parts of hetero-functional graph theory. In particular, it addresses the "system concept", the hetero-functional adjacency matrix, and the hetero-functional incidence tensor. The tensor-based formulation described in this work makes a stronger tie between HFGT and its ontological foundations in MBSE. Finally, the tensor-based formulation facilitates an understanding of the relationships between HFGT and multi-layer networks.
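As a hedged sketch of the "system concept" construct mentioned above: in HFGT it can be represented as a boolean knowledge base whose rows are resources and whose columns are system processes, with an entry marking a feasible resource-process pairing. The toy power-system names below are invented for illustration:

```python
# Hedged sketch of HFGT's "system concept" as a boolean knowledge base:
# rows are resources, columns are system processes, and an entry is True
# when that resource can perform that process. Names are hypothetical.

resources = ["generator", "line", "load"]
processes = ["inject power", "transport power", "consume power"]

# J[i][j]: resource i is capable of process j
J = [
    [True,  False, False],   # generator injects power
    [False, True,  False],   # line transports power
    [False, False, True],    # load consumes power
]

# Counting the feasible resource-process pairs gives a basic structural
# measure of the system's capabilities.
dof = sum(sum(row) for row in J)
print(dof)
```

The multi-dimensional relationships the abstract refers to arise when such pairings are further indexed, for example by operand or location, which is where the tensor formulation replaces the flat matrix.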