upstream oil & gas


The Future of Plunger Lift Control Using Artificial Intelligence

#artificialintelligence

Dozens of plunger lift control algorithms have been developed to account for different well conditions and optimization protocols. However, challenges remain that prevent optimization at scale. To address these challenges, plunger lift optimization software was developed; one aspect of this software is enabling set-point optimization at scale.


Portfolio Management using Python -- Portfolio Optimization

#artificialintelligence

Portfolio optimization is the process of choosing the best portfolio among the set of all possible portfolios. The naive way is to generate a group of random allocations and figure out which one has the best Sharpe ratio. This is known as Monte Carlo simulation: a random weight is assigned to each security in the portfolio, and the mean and standard deviation of the daily returns are calculated, which in turn give the Sharpe ratio of each randomly selected allocation. But the naive approach is time-consuming, so an optimization algorithm built around a minimizer is used instead.
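A minimal sketch of both approaches on synthetic data (the asset count, sample size, and 252-day annualization below are illustrative assumptions, not figures from the article):

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic daily returns for illustration: 1000 days, 4 assets.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(1000, 4))

def sharpe(weights, returns, risk_free=0.0):
    port = returns @ weights                      # daily portfolio returns
    return (port.mean() - risk_free) / port.std() * np.sqrt(252)

# Naive Monte Carlo search: draw random allocations, keep the best Sharpe ratio.
best_w, best_s = None, -np.inf
for _ in range(5000):
    w = rng.random(returns.shape[1])
    w /= w.sum()                                  # weights sum to 1
    s = sharpe(w, returns)
    if s > best_s:
        best_w, best_s = w, s

# Minimizer-based alternative: minimize the negative Sharpe ratio
# subject to long-only, fully-invested weights.
n = returns.shape[1]
res = minimize(
    lambda w: -sharpe(w, returns),
    x0=np.full(n, 1.0 / n),
    bounds=[(0.0, 1.0)] * n,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
print("Monte Carlo best Sharpe:", round(best_s, 3))
print("Optimizer Sharpe:       ", round(-res.fun, 3))
```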


Productionising AI knowledge management

#artificialintelligence

Using AI for knowledge management is a great way to industrialise years of innovation on a company-wide level, writes Dr Warrick Cooke, Consultant at Tessella. An engineer who has worked in the same place – a factory, oil rig, nuclear power plant – for 20 years will be an expert in that facility. Their been-there-done-that experience means they can quickly make good decisions on the best response to a wide range of scenarios. That knowledge would be hugely valuable to others. It is also knowledge that will be lost when they move on.


AI could have profound effect on way GCHQ works, says director

The Guardian

GCHQ's director has said artificial intelligence software could have a profound impact on the way it operates, from spotting otherwise missed clues to thwart terror plots to better identifying the sources of fake news and computer viruses. Jeremy Fleming's remarks came as the spy agency prepared to publish a rare paper on Thursday defending its use of machine-learning technology to placate critics concerned about its bulk surveillance activities. "AI, like so many technologies, offers great promise for society, prosperity and security. Its impact on GCHQ is equally profound," he said. "While this unprecedented technological evolution comes with great opportunity, it also poses significant ethical challenges for all of society, including GCHQ." AI is considered controversial because it relies on computer algorithms to make decisions based on patterns found in data.


Positive Reinforcements Help Algorithm Forecast Underground Natural Reserves

#artificialintelligence

Texas A&M University (TAMU) and University of Oklahoma researchers have developed a reinforcement-based algorithm that automates forecasting of subterranean properties, enabling accurate prediction of oil and gas reserves. The algorithm focuses on correct characterization of the underground environment based on rewards accumulated for making correct predictions of the pressure and flow anticipated from boreholes. The TAMU team found that within 10 iterations of reinforcement learning, the algorithm could correctly and rapidly predict the properties of simple subsurface scenarios. TAMU's Siddharth Misra said, "We have turned history matching into a sequential decision-making problem, which has the potential to reduce engineers' efforts, mitigate human bias, and remove the need of large sets of labeled training data."
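As a rough, hand-rolled illustration of the reward-driven, sequential framing described above (this toy is my own simplification with a made-up forward model, not the TAMU/Oklahoma algorithm):

```python
import numpy as np

def forward_model(permeability):
    # Stand-in for a reservoir simulator: maps a guessed subsurface
    # property to a predicted borehole pressure (arbitrary toy relation).
    return 100.0 / (1.0 + permeability)

true_permeability = 4.0
observed_pressure = forward_model(true_permeability)

rng = np.random.default_rng(0)
guess, step = 1.0, 0.5

for iteration in range(10):
    # Propose an adjustment (the "action") and score both states by a reward
    # that is higher when predicted pressure matches the observation.
    candidate = guess + rng.choice([-step, step])
    reward_now = -abs(forward_model(guess) - observed_pressure)
    reward_candidate = -abs(forward_model(candidate) - observed_pressure)
    if reward_candidate > reward_now:   # keep actions that improve the match
        guess = candidate
    print(iteration, round(guess, 2), round(forward_model(guess), 2))
```

The loop is closer to reward-guided local search than full reinforcement learning, but it conveys the framing in Misra's quote: history matching as a sequence of decisions scored by how well predictions agree with measured pressure and flow.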



A trusty robot to carry farms into the future

ZDNet

Farming is a tough business. Global food demand is surging, with as many as 10 billion mouths to feed by 2050. At the same time, environmental challenges and labor limitations have made the future uncertain for agricultural managers. A new company called Future Acres proposes to enable farmers to do more with less through the power of robots. The company, helmed by CEO Suma Reddy, who previously served as COO and co-founder at Farmself and has held multiple roles and led companies focused on the agtech space, has created an autonomous, electric agricultural robotic harvest companion named Carry to help farmers gather hand-picked crops faster and with less physical demand. Automation has been playing an increasingly large role in agriculture, and agricultural robots are widely expected to play a critical role in food production going forward.


Physics-constrained deep learning of building thermal dynamics

AIHub

Energy-efficient buildings are one of the top priorities for sustainably addressing global energy demand and reducing CO2 emissions. Advanced control strategies for buildings have been identified as a potential solution, with a projected energy-saving potential of up to 28%. However, the main bottleneck of model-free methods such as reinforcement learning (RL) is their sampling inefficiency and the resulting requirement for large datasets, which are costly to obtain or often not available in engineering practice. On the other hand, model-based methods such as model predictive control (MPC) suffer from the large cost associated with developing a physics-based model of the building's thermal dynamics. We address the challenge of developing cost- and data-efficient predictive models of a building's thermal dynamics via physics-constrained deep learning.
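A minimal sketch of the physics-constrained idea for a single thermal zone (the model structure, parameter values, and synthetic data below are my own simplifications, not the architecture from the paper):

```python
import torch

# One-zone discrete-time thermal model: T[k+1] = T[k] + dt*(a*(T_out[k]-T[k]) + b*q[k]).
# The coefficients a, b are learned, but softplus keeps them non-negative,
# encoding the physical prior that heat flows from hot to cold.
class ThermalModel(torch.nn.Module):
    def __init__(self, dt=0.25):
        super().__init__()
        self.dt = dt
        self.raw_a = torch.nn.Parameter(torch.tensor(0.0))
        self.raw_b = torch.nn.Parameter(torch.tensor(0.0))

    def forward(self, T, T_out, q):
        a = torch.nn.functional.softplus(self.raw_a)
        b = torch.nn.functional.softplus(self.raw_b)
        return T + self.dt * (a * (T_out - T) + b * q)

# Synthetic training data generated from a "true" model with a=0.2, b=0.05.
torch.manual_seed(0)
T_out = 5.0 + 3.0 * torch.rand(500)
q = torch.rand(500)
T = torch.empty(500)
T[0] = 20.0
for k in range(499):
    T[k + 1] = T[k] + 0.25 * (0.2 * (T_out[k] - T[k]) + 0.05 * q[k])

model = ThermalModel()
opt = torch.optim.Adam(model.parameters(), lr=0.05)
for epoch in range(200):
    pred = model(T[:-1], T_out[:-1], q[:-1])   # one-step-ahead predictions
    loss = torch.mean((pred - T[1:]) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final one-step MSE:", loss.item())
```

Constraining the learned parameters this way is one simple form of the data-efficiency argument: the structure rules out physically implausible dynamics, so less data is needed to fit the remaining degrees of freedom.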


Soft robots for ocean exploration and offshore operations: A perspective

Robohub

Most of the ocean is unknown. Yet we know that the most challenging environments on the planet reside in it. Understanding the ocean in its totality is a key component for the sustainable development of human activities and for the mitigation of climate change, as proclaimed by the United Nations. We are glad to share our perspective on the role of soft robots in ocean exploration and offshore operations at the outset of the ocean decade (2021-2030). In this study by the Soft Systems Group (part of The School of Engineering at The University of Edinburgh), we focus on the two ends of the water column: the abyss and the surface.


Consistency of random-walk based network embedding algorithms

arXiv.org Machine Learning

Random-walk based network embedding algorithms like node2vec and DeepWalk are widely used to obtain Euclidean representations of the nodes in a network prior to performing downstream network inference tasks. Nevertheless, despite their impressive empirical performance, there is a lack of theoretical results explaining their behavior. In this paper we study the node2vec and DeepWalk algorithms through the perspective of matrix factorization. We analyze these algorithms in the setting of community detection for stochastic blockmodel graphs; in particular, we establish large-sample error bounds and prove consistent community recovery of node2vec/DeepWalk embeddings followed by k-means clustering. Our theoretical results indicate a subtle interplay between the sparsity of the observed networks, the window sizes of the random walks, and the convergence rates of the node2vec/DeepWalk embeddings toward the embedding of the true but unknown edge probabilities matrix. More specifically, as the network becomes sparser, our results suggest using larger window sizes, or equivalently, taking longer random walks, in order to attain a better convergence rate for the resulting embeddings. The paper includes numerical experiments corroborating these observations.
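A compact sketch of the pipeline the paper analyzes, on a toy two-block stochastic blockmodel (the block sizes, edge probabilities, walk length, and window size below are arbitrary choices for illustration):

```python
import random
import numpy as np
import networkx as nx
from gensim.models import Word2Vec
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

random.seed(0)
sizes = [100, 100]                        # two communities of 100 nodes each
probs = [[0.10, 0.02], [0.02, 0.10]]      # within- vs. between-block edge rates
G = nx.stochastic_block_model(sizes, probs, seed=0)
labels_true = [G.nodes[v]["block"] for v in G.nodes]

def random_walk(G, start, length):
    walk = [start]
    for _ in range(length - 1):
        nbrs = list(G.neighbors(walk[-1]))
        if not nbrs:
            break
        walk.append(random.choice(nbrs))
    return [str(v) for v in walk]

# DeepWalk-style corpus: several uniform random walks from every node.
walks = [random_walk(G, v, length=40) for v in G.nodes for _ in range(10)]

# Skip-gram embedding of the walks; the window size is the knob the paper
# suggests enlarging as the network gets sparser.
model = Word2Vec(walks, vector_size=32, window=5, min_count=0, sg=1, seed=0)
X = np.array([model.wv[str(v)] for v in G.nodes])

# k-means on the embeddings, compared against the true block labels.
labels_pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("adjusted Rand index:", adjusted_rand_score(labels_true, labels_pred))
```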