Total SA plans to start a digital factory in the coming weeks to tap artificial intelligence in a bid to save hundreds of millions of dollars on exploration and production projects, according to an executive. The use of artificial intelligence to screen geological data will help identify new prospects, and shorten the time to acquire licenses, drill and make discoveries, Arnaud Breuillac, head of E&P, said at a conference organized by IFP Energies Nouvelles in Paris on Friday. It will also help optimize the use of equipment and reduce maintenance costs, he said. The digital factory will employ between 200 and 300 engineers and build on successful North Sea pilot projects, Chief Executive Officer Patrick Pouyanne said at the same event. It will also be a way to attract "young talent" to the industry.
It is the near future. You wake in a house warmed by a heat pump that extracts energy from deep below the ground and delivers it to your home. You rise and make yourself a cup of tea – from water boiled on a hydrogen-burning kitchen stove. Then you head to work – in a robot-driven electric car directed by a central control network to avoid traffic jams. At midday, you pause for lunch: a sandwich made of meat grown in a laboratory.
A natural gas-fired turbine, manufactured by Caterpillar Inc. subsidiary Solar Turbines Inc., runs a compressor at a natural gas compression station in Bismarck, North Dakota, operated by Williston Basin Interstate Pipeline Co., a subsidiary of MDU Resources Group Inc. In the world of oil and natural gas, engineers, geologists, and drilling and production departments tend to get the lion's share of the credit when good things happen, and most of the blame when they don't. That's fair, given the crucial roles these groups of employees play within the thousands of companies that make up the U.S. oil and gas industry. But in recent years, as overall domestic production has risen at a pace no one could have foreseen even five years ago, the credit has begun to shift. These human resources remain indispensable to the success of any company, but the deployment of a raft of advancing technologies has played an ever-greater role in enabling companies to maximize recoveries and profits.
This is a constrained global optimization package built upon Bayesian inference and Gaussian processes that attempts to find the maximum value of an unknown function in as few iterations as possible. This technique is particularly suited for the optimization of expensive functions, in situations where the balance between exploration and exploitation is important. With the release of version 1.0.0 a number of API-breaking changes were introduced. I understand this can be a headache for some, but these changes were necessary and ultimately made the package better. If you have used this package in the past, I suggest you take the basic and advanced tours (found in the examples folder) in order to familiarize yourself with the new API.
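To illustrate the exploration/exploitation trade-off at the heart of this approach, here is a minimal, self-contained sketch of Bayesian optimization in NumPy. It is not this package's API: the squared-exponential kernel, its length-scale, the UCB acquisition rule, and the test function are all illustrative choices.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-5):
    # Exact GP regression: posterior mean and std at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks)
    return mu, np.sqrt(np.clip(var, 0.0, None))

def bayes_opt(f, lo, hi, n_init=3, n_iter=12, kappa=2.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, n_init)          # a few random initial probes
    y = np.array([f(x) for x in X])
    grid = np.linspace(lo, hi, 200)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        # UCB acquisition: high mean (exploitation) + high uncertainty
        # (exploration), weighted by kappa.
        x_next = grid[np.argmax(mu + kappa * sigma)]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    best = np.argmax(y)
    return X[best], y[best]

# Example: maximize -(x - 2)^2 on [0, 4]; the true maximum is at x = 2.
best_x, best_y = bayes_opt(lambda x: -(x - 2) ** 2, 0.0, 4.0)
```

Each iteration spends one evaluation of the expensive function where the acquisition is highest, which is why the method needs far fewer calls than a grid search would.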
Sometime in the early 2000s, while sitting in my dentist's chair, I began to wonder about the real-world pain that someone could inflict on another human being simply by hacking the new digital x-ray system that the dentist had installed. Would it be possible, for example, for a hacker to modify the digital images from the x-rays so that the dentist would not be able to find and repair painful cavities, or to cause the dentist to perform an unnecessary root canal, filling, or other procedure? How certain could I be that the images of my own teeth had not been tampered with? Several years later, when I had an MRI after an auto accident, I wondered even further – could hackers modify images so as to cause a person to have his head cut open to remove a tumor when, in fact, he had no tumors? Or cause a scan to appear normal when the victim actually had a life-threatening condition requiring immediate attention?
The TriRhenaTech alliance universities and their partners presented their competences in the field of artificial intelligence, and their cross-border cooperations with industry, at the tri-national conference 'Artificial Intelligence: from Research to Application' on March 13th, 2019 in Offenburg. The TriRhenaTech alliance is a network of universities in the Upper Rhine Trinational Metropolitan Region comprising the German universities of applied sciences in Furtwangen, Kaiserslautern, Karlsruhe, and Offenburg, the Baden-Wuerttemberg Cooperative State University Loerrach, the French university network Alsace Tech (made up of 14 'grandes écoles' in the fields of engineering, architecture, and management), and the University of Applied Sciences and Arts Northwestern Switzerland. The alliance's common goal is to reinforce the transfer of knowledge, research, and technology, as well as the cross-border mobility of students.
The oil price crash of 2014 and the global 'digitalization and disruption' drive coincided in a rather bizarre way to push the oil industry to seek cost cuts through innovation and new technologies. Big Tech was only too pleased to help Big Oil, seeing a new revenue stream in an industry long thought to be of the 'dinosaur' type that was too slow to embrace new ways of doing things. Many oil and gas firms, especially the world's biggest, are already using data analytics, cloud computing, digital oil fields, digital twins, robotics, automation, predictive maintenance, machine learning, and even AI. The technology giants have seized the opportunity to sell such services to Big Oil, and top managers at Amazon Web Services, Microsoft Azure, and ABB Group, to name a few, flocked to this week's top energy industry event CERAWeek by IHS Markit in Houston to pitch their solutions to a wider audience. "A great wave of innovation and technology is transforming the industry and reshaping the energy future," said Daniel Yergin, conference chair and vice chairman of IHS Markit.
As the amount of data created daily increases (already at 2.5 quintillion bytes a day, allegedly), ML techniques are allowing us to cluster, organise, and turn this data into actionable information. This is especially true in the realm of Cyber Security. Don't be scared of the term Machine Learning: it really just means a computer that can learn to do something without being explicitly programmed for that task. The process typically involves training the machine to do a task (i.e. learning from labelled examples rather than from hand-written rules). Let's have a quick look at some of the ways we encounter ML every day in Cyber Security.
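To make "learning without being explicitly programmed" concrete, here is a toy supervised classifier in plain Python. The spam/ham mini-dataset and the smoothing scheme are purely illustrative; a real spam filter would train on millions of labelled messages, but the principle is the same: the program is never given rules for spam, it infers word weights from examples.

```python
from collections import Counter

def train(examples):
    # Count how often each word appears under each label.
    counts = {"spam": Counter(), "ham": Counter()}
    totals = {"spam": 0, "ham": 0}
    for text, label in examples:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    # Naive-Bayes-style score: product of per-word frequencies
    # with add-one smoothing; the higher-scoring label wins.
    scores = {}
    for label in counts:
        score = 1.0
        for word in text.lower().split():
            score *= (counts[label][word] + 1) / (totals[label] + 2)
        scores[label] = score
    return max(scores, key=scores.get)

# Hypothetical training data: four labelled messages.
examples = [
    ("win free money now", "spam"),
    ("claim your free prize", "spam"),
    ("meeting moved to monday", "ham"),
    ("lunch at noon today", "ham"),
]
counts, totals = train(examples)
print(classify("free money prize", counts, totals))  # → spam
```

Swap in a different training set and the same code learns a different task, which is exactly the property that makes ML useful for fast-moving threats.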
Self-organization can be broadly defined as the ability of a system to display ordered spatio-temporal patterns solely as the result of the interactions among the system components. Processes of this kind characterize both living and artificial systems, making self-organization a concept that is at the basis of several disciplines, from physics to biology to engineering. Placed at the frontiers between disciplines, Artificial Life (ALife) has heavily borrowed concepts and tools from the study of self-organization, providing mechanistic interpretations of life-like phenomena as well as useful constructivist approaches to artificial system design. Despite its broad usage within ALife, the concept of self-organization has often been excessively stretched or misinterpreted, calling for a clarification that could help trace the borders between what can and cannot be considered self-organization. In this review, we discuss the fundamental aspects of self-organization and list the main usages within three primary ALife domains, namely "soft" (mathematical/computational modeling), "hard" (physical robots), and "wet" (chemical/biological systems) ALife. Finally, we discuss the usefulness of self-organization within ALife studies, point to perspectives for future research, and list open questions.
Reservoir Computing (RC) is a popular methodology for the efficient design of Recurrent Neural Networks (RNNs). Recently, the advantages of the RC approach have been extended to the context of multi-layered RNNs, with the introduction of the Deep Echo State Network (DeepESN) model. In this paper, we study the quality of state dynamics in progressively higher layers of DeepESNs, using tools from the areas of information theory and numerical analysis. Our experimental results on RC benchmark datasets reveal the fundamental role played by the strength of inter-reservoir connections to increasingly enrich the representations developed in higher layers. Our analysis also gives interesting insights into the possibility of effective exploitation of training algorithms based on stochastic gradient descent in the RC field.
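As background, the reservoir computing recipe the abstract builds on can be sketched in a few lines: a fixed, randomly initialized recurrent reservoir is driven by the input sequence, and only a linear readout on the reservoir states is trained. The sketch below is a minimal single-layer echo state network in NumPy, not the paper's DeepESN; the reservoir size, spectral radius, ridge parameter, and sine-prediction task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_res = 1, 50

# Fixed random input and recurrent weights; the recurrent matrix is
# rescaled so its spectral radius is below 1 (a common ESN heuristic).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    # Drive the untrained reservoir with an input sequence; collect
    # the state x(t) = tanh(W x(t-1) + W_in u(t)) at every step.
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout, by ridge regression, to predict the
# next value of a sine wave from the current reservoir state.
u = np.sin(np.linspace(0.0, 20.0, 300))
S = run_reservoir(u[:-1])          # states, shape (299, n_res)
target = u[1:]                     # one-step-ahead targets
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ target)
pred = S @ W_out
```

A DeepESN stacks several such reservoirs, feeding each layer's states into the next; the abstract's point is that the strength of those inter-reservoir connections governs how much richer the higher layers' dynamics become.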