Though not an alternative to human knowledge and ingenuity, AI is regarded as a supporting tool that helps humans. While AI presently struggles with tasks that require real-world common sense, it can process large amounts of data far faster than a human brain. Artificial intelligence systems play a vital role when decisions depend on unstructured data such as images, social media posts, or open-ended surveys. Amazon, for instance, recommends products to buyers before they even search for a particular item. Amazon has made this possible using machine learning techniques and now layers unstructured data on top of its powerful, integrated collection of structured analytics, such as customers' product histories, addresses, and payment details.
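One simple way structured purchase histories can drive recommendations is item-to-item similarity. The following minimal sketch uses cosine similarity over an invented purchase matrix; the data and item names are illustrative only, and Amazon's actual systems are of course far more elaborate.

```python
import numpy as np

# Hedged sketch: item-to-item cosine similarity over a tiny purchase
# matrix. Rows = customers, columns = items; 1 = purchased. All data
# here are invented for illustration.
purchases = np.array([
    [1, 1, 0, 0],   # bought kettle and teapot
    [1, 1, 1, 0],   # kettle, teapot, mugs
    [0, 1, 1, 1],   # teapot, mugs, coasters
    [1, 0, 0, 0],   # kettle only
], dtype=float)
items = ["kettle", "teapot", "mugs", "coasters"]

# Cosine similarity between item purchase vectors (columns)
norms = np.linalg.norm(purchases, axis=0)
sim = (purchases.T @ purchases) / np.outer(norms, norms)
np.fill_diagonal(sim, 0.0)            # ignore self-similarity

# Recommend for a customer who only bought a kettle (item 0)
best = items[int(np.argmax(sim[0]))]
print(best)
```

Customers who bought a kettle most often also bought a teapot, so the teapot scores highest for the kettle-only customer.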
After several years of research on machine learning algorithms running on oil and gas production data, Solution Seeker has developed a hierarchical neural network model that improves predictive power for real-time production optimization. The model combines neural network learning algorithms with domain knowledge in the form of first-principles physics and production-system logic. Hart Energy spoke with Vidar Gunnerud, CEO of Solution Seeker, about the company's technology and the importance of artificial intelligence (AI) to the oil and gas sector. How important is AI to the oil and gas industry? Gunnerud: I think AI is the future.
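The general idea of blending a learned model with first-principles structure can be sketched as a loss that mixes a data-fit term with a physics penalty. The example below is NOT Solution Seeker's model; the linear per-well rate model, the non-negativity constraint standing in for production-system logic, and all data are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: fit per-well productivity gains to measured total
# production, with a physics penalty discouraging negative per-well
# rates. All data are synthetic.
rng = np.random.default_rng(0)

X = rng.uniform(0.0, 1.0, size=(50, 3))       # choke openings, 3 wells
true_gain = np.array([10.0, 20.0, 15.0])      # hidden well productivities
y = X @ true_gain + rng.normal(0.0, 0.1, 50)  # measured total rate

w = np.zeros(3)   # learned per-well gains
lam = 0.1         # weight of the physics penalty

for _ in range(2000):
    per_well = X * w                     # predicted per-well rates
    resid = per_well.sum(axis=1) - y     # data misfit on the total rate
    grad_data = 2.0 * X.T @ resid / len(y)
    neg = np.minimum(per_well, 0.0)      # violations of rate >= 0
    grad_phys = 2.0 * (X * neg).sum(axis=0) / len(y)
    w -= 0.05 * (grad_data + lam * grad_phys)

print(np.round(w, 1))   # close to the hidden productivities
```

The physics term only activates when the model proposes an unphysical (negative) rate, so domain knowledge shapes the solution without overriding the data.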
Industrial revolutions--from mechanization to electrification and mass production to increased automation--have long been about replacing human muscle with machines. For many factory workers who might face the threat of redundancy, that is scary enough. But the fourth revolution, which is more about replacing human brain power with artificial intelligence (AI), presents a change that many more workers are finding difficult to accept. AI can provide immediate impact for oil and gas companies--reduced expenses, increased productivity, improved work methods--but energy companies have been slow to adopt the technologies available. This might have to do with security concerns, cost, or even just a lack of understanding about the benefits to be gained.
Ever since software development progressed beyond raw compiler code, there has existed a range of tools to make developing easier and more effective. A number of projects, however, point in an interesting direction for the sector. For instance, Amazon recently announced the launch of Cloud9, an integrated development environment that connects directly to the company's cloud computing platform. It is a clear sign that machine learning is becoming a strong presence in cloud-based software development. Developers using the platform can easily tap into the cloud-based AI baked into the software to create the next generation of apps.
The technology is expected to wipe out millions of jobs but, surprisingly, to create even more by 2020. Artificial intelligence (AI) could create over two million jobs by 2020, outnumbering those lost to the technology, according to Gartner. In a recent report, Gartner projected that by 2020 AI will create a total of 2.3 million jobs while destroying 1.8 million. Though the latter is a significant number, in the long term more jobs can be created and enhanced by the technology across the job ladder, according to the analyst firm. AI implementation will affect every industry, with manufacturing expected to be hit hardest.
We use our experience with the Dipmeter Advisor system for well-log interpretation as a case study to examine the development of commercial expert systems. We discuss the nature of these systems as we see them in the coming decade, characteristics of the evolution process, development methods, and the skills required in the development team. We argue that the tools and ideas of rapid prototyping and successive refinement accelerate the development process. We note that different types of people are required at different stages of expert system development: those who are primarily knowledgeable in the domain but who can use the framework to expand the domain knowledge, and those who can actually design and build expert system tools and components. We also note that traditional programming skills continue to be required in the development of commercial expert systems. Finally, we discuss the problem of technology transfer and compare our experience with some of the traditional wisdom of expert system development. We have observed during this effort that the development of a commercial expert system imposes a substantially different set of constraints and requirements, in terms of development characteristics and methods, than those seen in the research environment.
Inherent batch-to-batch variability, aging, and contamination are major factors contributing to variability in oilfield cement-slurry performance. Of particular concern are problems encountered when a slurry is formulated with one cement sample and used with a batch having different properties. Such variability imposes a heavy burden on performance testing and is often a major factor in operational failure. We describe methods that allow the identification, characterization, and prediction of the variability of oilfield cements. Our approach involves predicting cement compositions, particle-size distributions, and thickening-time curves from the diffuse reflectance infrared Fourier transform spectrum of neat cement powders.
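Predicting composition from a spectrum is a multivariate calibration problem. The sketch below fits a linear spectrum-to-composition map by least squares on a training set; the synthetic spectra, phase count, and plain least-squares method are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

# Hedged sketch of multivariate calibration: learn a linear map from an
# infrared spectrum to phase composition. All spectra are synthetic
# stand-ins for DRIFT measurements.
rng = np.random.default_rng(1)

n_wn, n_phases = 20, 4                          # wavenumbers, cement phases
pure = rng.uniform(0.0, 1.0, (n_phases, n_wn))  # pure-phase reference spectra

# Training batches: known compositions and their (noisy) mixture spectra
comp_train = rng.dirichlet(np.ones(n_phases), size=100)
spec_train = comp_train @ pure + rng.normal(0.0, 0.01, (100, n_wn))

# Least-squares calibration: spectrum -> composition
B, *_ = np.linalg.lstsq(spec_train, comp_train, rcond=None)

# Predict the composition of a new cement batch from its spectrum alone
comp_new = rng.dirichlet(np.ones(n_phases))
spec_new = comp_new @ pure + rng.normal(0.0, 0.01, n_wn)
pred = spec_new @ B
print(np.round(pred, 2))   # should track comp_new
```

In practice spectroscopic calibration often uses partial least squares rather than ordinary least squares, since real spectra have far more wavenumbers than training samples.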
Early this year, fifty people took an experimental course at Xerox PARC on knowledge programming in Loops. During the course, they extended and debugged small knowledge systems in a simulated economics domain called Truckin. Everyone learned how to use the Loops environment, formulated the knowledge for their own program, and represented it in Loops. At the end of the course, a knowledge competition was run so that the strategies used in the different systems could be compared. The punchline to this story is that almost everyone learned enough about Loops to complete a small knowledge system in only three days. Although one must exercise caution in extrapolating from small experiments, the results suggest that there is substantial power in integrating multiple programming paradigms. We extend our special thanks to the course participants from Applied Expert Systems, Daisy Systems, ESL, Fairchild AI Lab, Lawrence Livermore Laboratories, Schlumberger-Doll Research Laboratory, SRI International, Stanford University, Teknowledge, and Xerox Corporation. Their participation and feedback are vital to the ongoing experimental process of simplifying the techniques of knowledge programming, and we enjoyed and will long remember their spirited involvement. As in many situations in life, pat solutions and simple mathematical models just aren't good enough. To cope with messiness, AI researchers have found that large amounts of problem-specific knowledge are usually needed.
Preventive-maintenance schedules occurring in industry are often suboptimal with regard to maintenance co-allocation, loss-of-production costs, and availability. We describe the implementation and deployment of a software decision-support tool for the maintenance planning of gas turbines, with the goal of reducing both direct maintenance costs and the often costly production losses during maintenance downtime. The optimization problem is formally defined, and we argue that its feasibility version is NP-complete. We outline a heuristic algorithm that can quickly solve the problem for practical purposes and validate the approach on a real-world scenario based on an oil production facility. We also compare the performance of our algorithm with results from using integer programming and discuss the deployment of the application.
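The co-allocation idea can be illustrated with a small greedy heuristic (not the paper's algorithm): each shutdown window incurs a fixed production-loss cost, and each activity may be performed up to a horizon of days before its due time, so nearby activities are merged into shared windows. The cost figure and horizon below are invented for illustration.

```python
# Hedged sketch of greedy maintenance co-allocation. Activities whose
# due days fall within `horizon` of an open shutdown window join that
# window instead of triggering a new shutdown.
SHUTDOWN_COST = 100.0   # production loss per maintenance window (illustrative)
HORIZON = 30            # max days an activity may be brought forward

def plan_windows(due_times, horizon=HORIZON):
    """Scan activities by due day; reuse the current window while the
    activity can still be brought forward into it, else open a new one."""
    windows = []    # list of (window_day, [activity due days])
    for t in sorted(due_times):
        if windows and t - windows[-1][0] <= horizon:
            windows[-1][1].append(t)     # co-allocate into the open window
        else:
            windows.append((t, [t]))     # open a new shutdown window
    return windows

due = [5, 12, 40, 45, 100]               # due days for five activities
windows = plan_windows(due)
naive_cost = SHUTDOWN_COST * len(due)    # one shutdown per activity
grouped_cost = SHUTDOWN_COST * len(windows)
print(len(windows), naive_cost, grouped_cost)  # 3 windows instead of 5
```

For this simplified single-cost model the greedy scan minimizes the number of windows; the real problem, with per-activity costs and availability constraints, is what the paper shows to be NP-complete in its feasibility version.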
Moreover, the system was designed from the beginning to be maintained on an ongoing basis without the involvement of senior knowledge engineers. In the manufacture of paper, wood is first pulped to separate its fibers. One of the predominant pulping processes takes place in a kraft pulp mill and consists of cooking wood chips at elevated temperature and pressure in the presence of certain chemicals (alkali and sulfide), washing the resultant brown pulp, bleaching the pulp to make it white, and drying it for shipment to a paper mill. Pitch, or wood resin, is the material in wood that is insoluble in water but soluble in organic solvents. It usually makes up 1 to 4 percent of the weight of wood after the bark is removed and is often a sticky material.