Day three of ADIPEC 2019 has just concluded here in Abu Dhabi, UAE, and much was said about oil demand concerns. Morning discourse was coloured by the International Energy Agency's view that demand is set to plateau by 2030 due to a pick-up in the use of electric vehicles around the world. In its latest market projections, the IEA said overall demand for energy is set to increase by 1% every year until 2040; however, headline demand will plateau ten years earlier than it had previously forecast. Elsewhere in its World Energy Outlook report, the IEA said US shale output, which has made the country the world's biggest oil producer, is likely to stay higher for longer than previously projected, with the country accounting for 85% of the increase in global oil production by 2030 and for 30% of the increase in natural gas. Meanwhile, turning to the coming 12 months, OPEC Secretary General Mohammed Barkindo said an uptick in demand for 2020 may be on the cards should the US-China trade stand-off end.
In this pattern, learn how to create and deploy deep learning models by using a Jupyter Notebook in an IBM Watson Studio environment. You also create deep learning experiments with hyperparameter optimization by using the Watson Studio GUI to monitor different runs, then select the best model for deployment. Computer vision is on the rise, and there are scenarios where a machine must classify images by class to aid the decision-making process. In this code pattern, we demonstrate multiclass classification (with three classes) by using IBM Watson Studio and IBM Deep Learning as a Service. We use yoga posture data to identify the class of a given image.
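The Watson Studio pipeline itself is GUI-driven, but the core of any three-class classifier is a softmax output over class scores. As a minimal, hypothetical sketch (not the code pattern's actual model), the final classification step can be illustrated in plain NumPy with made-up feature vectors standing in for image features:

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max before exponentiating for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical example: 4 flattened feature vectors, 3 posture classes
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))   # 4 samples, 8 features each
W = rng.normal(size=(8, 3))   # weights mapping features to 3 class scores
b = np.zeros(3)

probs = softmax(X @ W + b)    # class probabilities; each row sums to 1
preds = probs.argmax(axis=1)  # predicted class index per sample
```

In a real deep learning model the weights would come from training, and the features from convolutional layers; here they are random purely to show the shapes and the softmax decision rule.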
Natural gas is an important energy source for much of our industrial, heating and electricity needs, and its price can fluctuate greatly. I performed a time series analysis with external regressors to investigate how well modeling could forecast the price of natural gas. Using data from the US Energy Information Administration, I acquired monthly natural gas pricing data from January 1990 to the present, along with data on a number of related energy features.
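The idea of a time series model with external regressors can be sketched in miniature: regress the current price on its own lag plus an outside variable. This is an illustrative autoregressive fit on synthetic data (the real analysis used EIA data and richer models), estimated by ordinary least squares:

```python
import numpy as np

# Synthetic stand-in for monthly prices and one external energy feature
rng = np.random.default_rng(42)
n = 120
exog = rng.normal(size=n)  # e.g., a related energy series
price = np.zeros(n)
for t in range(1, n):
    price[t] = 2.0 + 0.8 * price[t - 1] + 0.5 * exog[t] + 0.1 * rng.normal()

# Fit price_t ~ const + price_{t-1} + exog_t by ordinary least squares
X = np.column_stack([np.ones(n - 1), price[:-1], exog[1:]])
y = price[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def forecast(last_price, next_exog):
    # One-step-ahead forecast given the latest price and next month's regressor
    return coef[0] + coef[1] * last_price + coef[2] * next_exog
```

With enough data the fitted coefficients recover the simulated dynamics; in practice a library such as statsmodels' SARIMAX (with its `exog` argument) handles seasonality and diagnostics on top of this same idea.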
Despite my incessant buzzword bashing, I'll concede this much: it's important to grapple with next-gen tech via experts who actually know what they are talking about. We got an earful on day one of the Constellation Research Connected Enterprise 2019 event. How quantum computing could (someday) break 2048-bit RSA encryption https://t.co/o5EfaqgcZN "New study shows quantum tech will catch up with today's encryption standards sooner than expected." Still, next-gen tech needs to be held to the fire of project results.
The Abu Dhabi National Oil Company (ADNOC) is transforming its business through digital projects that range from deciding where to drill for oil and gas, to helping the company decide where to sell its final products. The state-owned oil company has driven the United Arab Emirates' economy since it was founded almost half a century ago, and its head of digital, Abdul Nasser Al Mughairbi, has been driving digital transformation since 2017. Each day, ADNOC produces three million barrels of oil and processes billions of cubic feet of gas. It has businesses involved in the extraction of raw materials upstream as well as the processing of materials to add value downstream. Add to this the transportation, sales and marketing of oil and gas, and you have a large, complex organisation.
Kongsberg Digital has signed an agreement to digitalise the Nyhamna facility, a gas processing and export hub for Ormen Lange and other fields connected to the Polarled pipeline. A/S Norske Shell is entering the partnership as operator of Ormen Lange and, on behalf of Gassco, as the operator of Nyhamna. The value of the contract scope, with a digital twin, is approximately $11mn, with deliverables starting from Q4 2019. The project will utilise the Kognifai Dynamic Digital Twin to create a virtual representation of the gas plant and its behaviour, continuously updated with integrated information reflecting the status of the facility in real time. As the technical service provider at Nyhamna, Shell will be able to simulate scenarios and uncover new options for optimising the twin's real-life counterpart.
To help them in their decision-making and implementation, the time has come for another IoTSWC (IoT Solutions World Congress), the international flagship event that will bring together more than 350 exhibitors, including the world's leading suppliers of IoT, artificial intelligence and blockchain solutions. A new feature of this year's fair will be a specific area called IoT Solutions.Font, which will provide visibility for start-ups with original and innovative IoT, artificial intelligence and blockchain based products and services that have already been tested in the market and have potential for internationalisation. This year the following will be on display: an application that measures the driver's behaviour to update the cost of insurance policies; drones, sensors and blockchain that monitor the water quality of the Volga River; an autonomous electric car equipped with a cybersecurity system that blocks attacks that jeopardize the reliability or privacy of the vehicle; a solution to check gas distribution networks, reducing energy losses and preventing fraud; a platform that combines IoT, artificial intelligence and 5G to provide predictive medical care and handle emergencies affecting the elderly and chronically ill; a system to inspect and repair wind farm turbines using drones, artificial intelligence and the cloud; an assembly assistance system which accurately guides the employees of a factory through the different steps to be performed; an application to enable farmers to view the status of their holdings in real time and facilitate their decision-making; and software based on artificial intelligence for submersible pumps used in oil wells. Blockchain and artificial intelligence also have their own monographic forums, Blockchain Solutions World (BSW) and the AI & Cognitive Systems Forum (AI & CS), with the aim of deepening these two technologies that enhance and reinvent the capabilities of the internet of things.
This paper tackles the challenge presented by small data to the task of Bayesian inference. A novel methodology, based on manifold learning and manifold sampling, is proposed for solving this computational statistics problem under the following assumptions: 1) neither the prior model nor the likelihood function is Gaussian, and neither can be approximated by a Gaussian measure; 2) the numbers of functional inputs (system parameters) and functional outputs (quantities of interest) can be large; 3) the number of available realizations of the prior model is small, leading to the small-data challenge typically associated with expensive numerical simulations, and the number of experimental realizations is also small; 4) the number of posterior realizations required for decision-making is much larger than the available initial dataset. The method and its mathematical aspects are detailed. Three applications are presented for validation: the first two involve mathematical constructions aimed at developing intuition around the method and exploring its performance. The third example aims to demonstrate the operational value of the method using a more complex application related to the statistical inverse identification of the non-Gaussian matrix-valued random elasticity field of a damaged biological tissue (osteoporosis in a cortical bone) using ultrasonic waves.
Recent work has shown how to embed differentiable optimization problems (that is, problems whose solutions can be backpropagated through) as layers within deep learning architectures. This method provides a useful inductive bias for certain problems, but existing software for differentiable optimization layers is rigid and difficult to apply to new settings. In this paper, we propose an approach to differentiating through disciplined convex programs, a subclass of convex optimization problems used by domain-specific languages (DSLs) for convex optimization. We introduce disciplined parametrized programming, a subset of disciplined convex programming, and we show that every disciplined parametrized program can be represented as the composition of an affine map from parameters to problem data, a solver, and an affine map from the solver's solution to a solution of the original problem (a new form we refer to as affine-solver-affine form). We then demonstrate how to efficiently differentiate through each of these components, allowing for end-to-end analytical differentiation through the entire convex program. We implement our methodology in version 1.1 of CVXPY, a popular Python-embedded DSL for convex optimization, and additionally implement differentiable layers for disciplined convex programs in PyTorch and TensorFlow 2.0. Our implementation significantly lowers the barrier to using convex optimization problems in differentiable programs. We present applications in linear machine learning models and in stochastic control, and we show that our layer is competitive (in execution time) compared to specialized differentiable solvers from past work.
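The key idea of an optimization layer, differentiating the solution of a problem with respect to its parameters, can be illustrated without the paper's full affine-solver-affine machinery. A toy, hypothetical example (not the CVXPY implementation itself) is the least-squares problem x*(θ) = argmin_x ‖Ax − θ‖², whose solution map is linear in θ, so its Jacobian is available in closed form and can be checked against finite differences:

```python
import numpy as np

# "Optimization layer": x*(theta) = argmin_x ||A x - theta||^2.
# The closed-form solution x* = (A^T A)^{-1} A^T theta means gradients
# of a downstream loss can flow through the solve via this Jacobian.
rng = np.random.default_rng(1)
A = rng.normal(size=(5, 3))

def solve_layer(theta):
    x_star, *_ = np.linalg.lstsq(A, theta, rcond=None)
    return x_star

def jacobian():
    # Analytic d x*/d theta = (A^T A)^{-1} A^T (constant: the map is linear)
    return np.linalg.solve(A.T @ A, A.T)

theta = rng.normal(size=5)
J = jacobian()

# Verify against central finite differences
eps = 1e-6
J_fd = np.zeros((3, 5))
for i in range(5):
    d = np.zeros(5)
    d[i] = eps
    J_fd[:, i] = (solve_layer(theta + d) - solve_layer(theta - d)) / (2 * eps)
```

For general disciplined convex programs no closed form exists, which is where the paper's approach (implemented in the cvxpylayers library for PyTorch and TensorFlow 2.0) comes in: it differentiates through the solver itself.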
In the geophysical field, seismic noise attenuation has long been considered a critical problem, especially for pre-stack data processing. Here, we propose a deep-learning model for this task. Rather than directly applying an existing de-noising model from ordinary images to seismic data, we have designed a particular deep-learning model based on residual neural networks. Named N2N-Seismic, it has a strong ability to recover seismic signals to their intact condition while preserving primary signals. The proposed model, which attenuates noise with great success, has been tested on two different seismic datasets. Several metrics show that our method outperforms conventional approaches in terms of signal-to-noise ratio, mean-squared error, phase spectrum, etc. Moreover, robustness tests on effectively removing random noise from datasets with strong and weak noise confirm that the proposed model maintains a good level of adaptation under large variations in noise characteristics and intensities.
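Two of the evaluation metrics named above, signal-to-noise ratio and mean-squared error, are standard and easy to state precisely. As an illustrative sketch (the definitions are conventional; the paper's exact conventions may differ), comparing a noisy trace against a partially denoised one on a synthetic sinusoidal "seismic" signal:

```python
import numpy as np

def mse(clean, estimate):
    # Mean-squared error between the clean signal and the estimate
    return np.mean((clean - estimate) ** 2)

def snr_db(clean, estimate):
    # Signal-to-noise ratio in decibels: signal power over residual power
    residual = clean - estimate
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(residual ** 2))

# Toy check: an estimate closer to the clean trace scores higher SNR, lower MSE
t = np.linspace(0, 1, 500)
clean = np.sin(2 * np.pi * 25 * t)
noise = np.random.default_rng(0).normal(size=t.size)
noisy = clean + 0.3 * noise            # raw trace
denoised = clean + 0.15 * noise        # stand-in for a de-noiser's output
```

Halving the residual noise raises SNR by about 6 dB, which is the kind of improvement such metrics are used to quantify when comparing de-noising methods.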