Upstream Oil & Gas

Big data and machine learning for prediction of corrosion in pipelines - DNV GL - Software


We wanted to learn more about the opportunities and maturity level of technologies like big data, machine learning, artificial intelligence, and the internet of things. For condition monitoring, these analytical tools can help operators improve maintenance regimens and optimize inspection intervals by combining asset-specific, industry, historical, and real-time data into data-driven predictive and decision processes. "Big data" in the pipeline assessment business context includes the vast quantities of data coming from sensor-laden inspection devices such as pigs, and increasingly from embedded or remote sensors, the most common of which are cathodic protection monitors and corrosion coupons. In addition, there are many types of asset property data, historical assessments, operational state, soils, and environmental information, which exist in many formats and are often unstructured, such as documents, photographs, or other images. In the hackathon we trained our algorithms on previously pigged sections of pipeline where we had the inspection history needed to determine levels of corrosion, along with many forms of non-inspection information, including pipe properties, corrosion protection history, coating type, local climate data, soil properties, and previous field examination results.
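The training setup described above (inspection-derived corrosion labels plus tabular asset features) can be illustrated with a minimal sketch. The feature names, data values, and the nearest-neighbour classifier here are all invented for illustration; the article does not disclose the actual model or dataset used in the hackathon.

```python
# Hypothetical sketch: predicting a corrosion-severity label from pipeline
# attributes with a tiny nearest-neighbour classifier on synthetic data.
import math

# Each record: (pipe_age_years, soil_resistivity_ohm_cm, coating_quality_0to1)
# Labels come from in-line inspection (pigging) history: 0 = low, 1 = high corrosion.
TRAIN = [
    ((35.0, 1500.0, 0.2), 1),
    ((40.0, 1200.0, 0.3), 1),
    ((10.0, 9000.0, 0.9), 0),
    ((8.0, 11000.0, 0.8), 0),
    ((30.0, 2000.0, 0.4), 1),
    ((12.0, 8000.0, 0.7), 0),
]

def _scaled(x):
    # Crude per-feature scaling so the distance treats features comparably.
    return (x[0] / 50.0, x[1] / 12000.0, x[2])

def predict(features, k=3):
    """Majority vote among the k nearest training records."""
    fx = _scaled(features)
    dists = sorted((math.dist(fx, _scaled(tx)), label) for tx, label in TRAIN)
    votes = [label for _, label in dists[:k]]
    return 1 if sum(votes) * 2 > len(votes) else 0
```

In practice the non-inspection features listed in the article (coating type, climate, soil properties) would feed a far richer model, but the shape of the problem is the same: labelled pigged segments in, corrosion-risk predictions for unpigged segments out.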

These Non-Tech Firms Are Making Big Bets On Artificial Intelligence


While much has been written about information technology companies investing in artificial intelligence, Loup Ventures managing partner Doug Clinton notes that many non-tech companies are capitalizing on AI technology as well. "In 10 years, every company will have to be an artificial intelligence company or they won't be competitive," Clinton said. Idexx makes products for the animal health-care sector. Microsoft and Google, along with internet giants, have the inside track in monetizing artificial intelligence technology, Mizuho Securities said in a report earlier this month.

Log Data @CloudExpo #BigData #Analytics #ML #AI #DigitalTransformation


With major technology companies and startups seriously embracing Cloud strategies, now is the perfect time to attend the 20th International @CloudExpo / @ThingsExpo, June 6-8, 2017, at the Javits Center in New York City, NY, and October 31 - November 2, 2017, at the Santa Clara Convention Center, CA. Join conference chair Roger Strukhoff (@IoT2040) for three days of intense Enterprise Cloud and 'Digital Transformation' discussion and focus, including Big Data's indispensable role in IoT, Smart Grids and the Industrial Internet of Things (IIoT), Wearables and Consumer IoT, as well as Digital Transformation in Vertical Markets. Attendees will also find fresh content in a new FinTech track, which incorporates machine learning, artificial intelligence, deep learning, and blockchain. The Call For Papers for speaking opportunities is now open.



Artificial intelligence (AI) might soon become one of the biggest competitive differentiators for these businesses. Within the oil and gas industry, AI and machine learning are already being used to process high-volume data and achieve operational efficiency, said Arunkumar Ranganathan, associate vice president and head of the domain and process consulting groups for energy, utilities, and services at technology consulting firm Infosys. For example, AI is being used to optimize the drilling process and improve operational efficiency, leading to a reduction in drilling costs. "AI techniques are yet to be applied [for] interpreting geophysical and geological functions and in other core business functions," Ranganathan said.

Making Sense of IIoT Analytics


Beyond specific domain expertise, newcomers might also have an edge working with Big Data technologies that are essential for applying analytics to IIoT data at scale, says Tara Prakriya, chief product officer at Maana, an IIoT analytics company focused primarily on the industrial and oil and gas sectors. Maana employs patented semantic search capabilities, advanced algorithms, deep learning, and something it calls a knowledge graph to extract information from time-series data silos as well as domain experts, applications, data warehouses, and Big Data stores to deliver predictive--and, more significantly, prescriptive--insights that can help manufacturers maximize plant-floor productivity or stoke profitability. GE is also adding to its analytics coffers through acquisition--its most recent being Bit Stew, which employs machine intelligence and Big Data technologies like NoSQL to tackle the IIoT data integration problem at scale. Siemens, meanwhile, says its MindSphere platform as a service (PaaS) provides the device management, connectivity, data storage, and infrastructure capabilities that will enable manufacturers to scale IIoT analytics beyond manufacturing use cases to next-generation business.

Researchers use Kinect to scan T. rex skull

MIT News

Upon discovering that their high-resolution dental scanners couldn't handle a jaw as big as a tyrannosaur's, they contacted the Camera Culture group at MIT's Media Lab, which had recently made headlines with a prototype system for producing high-resolution 3-D scans. The prototype wasn't ready for a job that big, however, so Camera Culture researchers used $150 in hardware and some free software to rig up a system that has since produced a 3-D scan of the entire five-foot-long T. rex skull, which a team of researchers -- including dentists, anthropologists, veterinarians, and paleontologists -- is using to analyze the holes in the jaw. Free software called MeshLab analyzes the point cloud and infers the shape of the surfaces that produced it. In ongoing work, Das, Murmann, Cohrn, Raskar, and a team of collaborators including the Wisconsin paleontologists are looking at fragmentation patterns at the edges of the holes, and at the holes' depths and diameters, to see if they can infer anything about the shape, hardness, and velocity of whatever object might have caused them.
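Inferring surfaces from a point cloud, as MeshLab does, can be illustrated in its very simplest form. This is not MeshLab's actual reconstruction algorithm (it implements far more sophisticated methods); it is only a sketch of the basic idea that a cloud of 3-D samples constrains an underlying surface, here a plane z = ax + by + c fitted by least squares.

```python
# Simplest possible surface inference: fit a plane z = a*x + b*y + c to a
# point cloud by least squares. Purely illustrative; real mesh
# reconstruction uses much richer surface models.
import numpy as np

def fit_plane(points):
    """points: (N, 3) array-like. Returns (a, b, c) minimising
    sum((a*x + b*y + c - z)**2) over all points."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # (a, b, c)
```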

Please Don't Hire a Chief Artificial Intelligence Officer


For these companies, investment in AI may help solve real business problems but will not become part of customer-facing products. As with earlier technologies, we are now hearing advice about "AI strategies" and how companies should hire Chief AI Officers. This well-educated, well-paid, and highly motivated individual will comb your organization looking for places to apply AI technologies, effectively making the goal to use AI rather than to solve real problems. Specific technologies provide specific functions and have specific data requirements.

Machine Learning Goes Viral In Oil Patch


That's where column analytics--software tools that meld predictive models with collected data--come into play. Using historical data from compressors with maintenance problems, the software pinpointed patterns and put online a system to alert the operator when problems were likely, Beck said. The technology has also been used by a large oil and gas company to gain insight about the drillbit downhole. Looking at the investor presentation slide decks of some of the industry's large independents and integrated oil companies, Beck said companies are talking about how they are using data analytics to improve efficiency in the oil field, including in the Permian Basin.
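The compressor example above is a classic predictive-maintenance pattern: learn what "normal" looks like from historical sensor data, then alert when live readings drift from it. A minimal sketch of that idea follows; the window size, threshold, and readings are invented for illustration and have nothing to do with the actual commercial system described.

```python
# Hypothetical sketch of the pattern-spotting idea: flag compressor sensor
# readings that deviate sharply from their recent history (rolling z-score).
from statistics import mean, stdev

def anomalies(readings, window=5, z_threshold=3.0):
    """Return indices where a reading deviates more than z_threshold
    standard deviations from the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged
```

A production system would model many correlated channels and failure modes, but the operator-alerting loop reduces to the same shape: score each new reading against learned history and raise a flag when the score crosses a threshold.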

Some Image and Video Processing: Motion Estimation with Block-Matching in Videos, Noisy and Motion-blurred Image Restoration with Inverse Filter in Python and OpenCV


The following figure shows how the quality of the transformed image decreases relative to the original when an n×n low-pass filter (LPF) is applied, and how the quality (measured in terms of PSNR) degrades as n (the LPF kernel width) increases. Similarly, as the kernel size grows, the quality of the final image obtained by down-sampling and then up-sampling the original image decreases, as shown in the following figure. The first video, obtained from YouTube, shows some students walking along a university corridor. The task is to extract some consecutive frames, mark a face in one frame, and then use that marked image to locate the face in all the remaining consecutive frames, thereby annotating the entire video and estimating the motion using the simple block-matching technique alone. The following figure shows the frame with the face marked; we then use this image and block matching to estimate the motion of the student through the video, marking his face in all the consecutive frames and reconstructing the video, as shown below. As can be seen from the following figure, the optimal median filter size is 5×5, which produces the highest-quality output when compared to the original image.
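The block-matching step described above can be sketched as an exhaustive search minimising a sum-of-absolute-differences (SAD) cost. This is a minimal illustrative version (the post's full pipeline also reads video frames with OpenCV); the frame contents, block size, and search radius here are synthetic.

```python
# Minimal exhaustive block matching with a sum-of-absolute-differences
# (SAD) cost, the technique used to track the marked face across frames.
import numpy as np

def match_block(ref_frame, next_frame, top, left, size=8, search=4):
    """Find the (row, col) in next_frame whose size x size block best
    matches ref_frame[top:top+size, left:left+size], searching offsets
    of up to +/- search pixels and minimising the SAD cost."""
    block = ref_frame[top:top + size, left:left + size].astype(np.int32)
    best_cost, best_pos = None, (top, left)
    h, w = next_frame.shape
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = top + dr, left + dc
            if r < 0 or c < 0 or r + size > h or c + size > w:
                continue  # candidate block would fall outside the frame
            cand = next_frame[r:r + size, c:c + size].astype(np.int32)
            cost = np.abs(block - cand).sum()
            if best_cost is None or cost < best_cost:
                best_cost, best_pos = cost, (r, c)
    return best_pos  # motion vector = best_pos minus (top, left)
```

Running this for every tracked block in every consecutive frame pair yields the per-frame motion vectors; drawing the matched block's rectangle on each frame reconstructs the annotated video.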

How to build a highly effective AI team


"In our experience, we found that an AI group needs at least three distinct roles: a data engineer to organize the data, a data scientist to investigate the data and a software engineer to implement applications." "In our approach to AI, we currently see three parts: generating information, interpreting information and making judgment about that information," says Martin Fiore, EY Americas tax talent leader. In the past year, EY has hired over 20 professionals focused on automation and AI. "In my view, AI is the new UI," explains Elliott Yama, assistant vice president of machine learning at software provider Apttus.