Oil & Gas


How to plot a box plot using the pandas Python library? - The Security Buddy

#artificialintelligence

Using a box plot, one can see the spread and skewness of data. It is a standardized way of displaying the five-number summary of the data: the minimum, the maximum, the median, the first quartile (25th percentile), and the third quartile (75th percentile). A box plot usually includes two parts. It includes a […]
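Since the article is a how-to on drawing box plots with pandas, here is a minimal, self-contained sketch; the DataFrame, its column names, and the random data are invented for illustration and are not taken from the article.

```python
# Minimal box-plot sketch with pandas (illustrative data, not from the article).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "series_a": rng.normal(loc=10, scale=2, size=200),
    "series_b": rng.normal(loc=12, scale=4, size=200),
})

# DataFrame.plot.box() draws one box per column: the box spans the first to
# third quartile, the line inside it is the median, and the whiskers extend
# toward the minimum and maximum (points beyond them are drawn as outliers).
ax = df.plot.box()
ax.set_ylabel("value")
plt.show()
```

The same figure can also be produced with df.boxplot(), which pandas offers as an alternative entry point.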


Is Artificial Intelligence A Net-Positive For Carbon Emissions?

#artificialintelligence

The scale of the energy transition required for rapid decarbonization is almost unfathomable: in the energy sector alone, reaching net-zero greenhouse gas emissions will require infrastructure investments of between $92 trillion and $173 trillion by 2050, according to estimates by BloombergNEF. AI has a massive role to play here, as "even small gains in flexibility, efficiency or capacity in clean energy and low-carbon industry can therefore lead to trillions in value and savings."


Why a Social License is Needed for AI

#artificialintelligence

If business wants to use AI at scale, adhering to the technical guidelines for responsible AI development isn't enough. It must obtain society's explicit approval to deploy the technology. Six years ago, in March 2016, Microsoft Corporation launched an experimental AI-based chatbot, TayTweets, whose Twitter handle was @TayandYou. Tay, an acronym for "thinking about you," mimicked a 19-year-old American girl online, so the digital giant could showcase the speed at which AI can learn when it interacts with human beings. Living up to its description as "AI with zero chill," Tay started off replying cheekily to Twitter users and turning photographs into memes. Some topics were off limits, though; Microsoft had trained Tay not to comment on societal issues such as Black Lives Matter. Soon enough, a group of Twitter users targeted Tay with a barrage of tweets about controversial issues such as the Holocaust and Gamergate. They goaded the chatbot into replying with racist and sexually charged responses, exploiting its repeat-after-me capability. Realizing that Tay was reacting like IBM's Watson, which started using profanity after perusing the online Urban Dictionary, Microsoft was quick to delete the first inflammatory tweets. Less than 16 hours and more than 100,000 tweets later, the digital giant shut down Tay.


Al Gore explains global AI program that is spying on thousands of facilities to monitor emissions

FOX News

Former Vice President Al Gore on Thursday outlined a global effort, run by "machine-learning" artificial intelligence, that is essentially spying on individual facilities in every country in the world to measure their emissions of greenhouse gases and target the world's largest emitters. At the World Economic Forum in Davos, Switzerland, Gore formally introduced attendees to the initiative known as Climate Tracking Real-Time Atmospheric Carbon Emissions, or Climate TRACE. The initiative has led to a website that allows for real-time tracking of emissions in any area of the world, which Gore said is allowing climate activists, reporters and others to identify high-priority industries and regions for emissions reduction programs. "It's a non-profit coalition that uses artificial intelligence to process data from 300 existing satellites and from 30,000 land, sea and air base sensors and multiple internet data streams to use artificial intelligence to create machine-learning algorithms to zoom in on every single significant source of greenhouse gas (GHG) pollution," he said of Climate TRACE. Gore showed how Climate TRACE uses these inputs to zoom in on specific facilities and assess how much they contribute to GHG emissions.


Computers that power self-driving cars could be a huge driver of global carbon emissions

#artificialintelligence

In the future, the energy needed to run the powerful computers on board a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today. That is one key finding of a new study from MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles are widely adopted. The data centers that house the physical computing infrastructure used for running applications are widely known for their large carbon footprint: They currently account for about 0.3 percent of global greenhouse gas emissions, or about as much carbon as the country of Argentina produces annually, according to the International Energy Agency. Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a statistical model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers currently do.
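As a rough sanity check, the energy side of that claim can be reproduced from the figures quoted above alone; this sketch uses only the study's stated numbers, and the comparison to today's data-center emissions additionally depends on the carbon intensity of the electricity, which the excerpt does not give.

```python
# Back-of-envelope energy estimate from the study's stated figures.
vehicles = 1_000_000_000   # fleet size assumed in the study
power_w = 840              # onboard computer power draw, watts
hours_per_day = 1          # driving time per vehicle per day

energy_per_day_gwh = vehicles * power_w * hours_per_day / 1e9   # Wh -> GWh
energy_per_year_twh = energy_per_day_gwh * 365 / 1000           # GWh -> TWh

print(f"{energy_per_day_gwh:.0f} GWh per day")    # ~840 GWh/day
print(f"{energy_per_year_twh:.0f} TWh per year")  # ~307 TWh/year
```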


Using machine learning to forecast amine emissions

AIHub

Global warming is partly due to the vast amount of carbon dioxide that we release, mostly from power generation and industrial processes, such as making steel and cement. For a while now, chemical engineers have been exploring carbon capture, a process that can separate carbon dioxide and store it in ways that keep it out of the atmosphere. This is done in dedicated carbon-capture plants, whose chemical process involves amines, compounds that are already used to capture carbon dioxide from natural gas processing and refining plants. Amines are also used in certain pharmaceuticals, epoxy resins, and dyes. The problem is that amines can also be harmful to the environment as well as a health hazard, making it essential to mitigate their impact.


Weatherford Signs Agreement With DataRobot To Advance AI Capabilities

#artificialintelligence

Weatherford International signed a multiyear agreement with artificial-intelligence (AI) company DataRobot to deliver advanced AI in its digital platforms, including the ForeSite production optimization and Centro well construction platforms. By forging this new relationship with DataRobot, Weatherford plans to accelerate the development of machine learning (ML) and AI-enabled offerings within its digital solutions portfolio to deliver innovative technologies to the market. Providing an integrated solution that combines physics-based and AI models at scale makes it possible to understand and leverage large quantities of data from every corner of an asset to improve operational performance. "We began our Industry 4.0 journey in 2017 by introducing our first AI/ML-based modules in our software platforms," said Matt Foder, Weatherford's senior vice president of innovation and new energy. "This agreement with DataRobot adds a solid foundation to operationalize and scale these modules and those of our customers, providing incremental value across the energy industry space. This collaborative innovation is aligned with our promise of delivering open and flexible digital platforms to our users."


Why we need new stories on climate - Rebecca Solnit

The Guardian

Every crisis is in part a storytelling crisis. This is as true of climate chaos as anything else. We are hemmed in by stories that prevent us from seeing, or believing in, or acting on the possibilities for change. Some are habits of mind, some are industry propaganda. Sometimes, the situation has changed but the stories haven't, and people follow the old versions, like outdated maps, into dead ends. We need to leave the age of fossil fuel behind, swiftly and decisively. But what drives our machines won't change until we change what drives our ideas. The visionary organiser adrienne maree brown wrote not long ago that there is an element of science fiction in climate action: "We are shaping the future we long for and have not yet experienced. I believe that we are in an imagination battle."


Spectroscopy and Chemometrics Machine-Learning News Weekly #1, 2023 – NIR Calibration Model

#artificialintelligence

Get the Spectroscopy and Chemometrics News Weekly in real time on Twitter @ CalibModel and follow us.

"Foods: Prediction Models for the Content of Calcium, Boron and Potassium in the Fruit of 'Huangguan' Pears Established by Using Near-Infrared Spectroscopy" LINK
"Construction and Application of Detection Model for Leucine and Tyrosine Content in Golden Tartary Buckwheat Based on Near Infrared Spectroscopy" LINK
"Rapid recognition of different sources of methamphetamine drugs based on hand-held near infrared spectroscopy and multi-layer-extreme learning machine algorithms" LINK
"Rapid determination of viscosity and viscosity index of lube base oil based on near-infrared spectroscopy and new transformation formula" LINK
"Simple dilated convolutional neural network for quantitative modeling based on near infrared spectroscopy techniques" LINK
"Fast and nondestructive discrimination of fresh tea leaves at different altitudes based on near infrared spectroscopy and various chemometrics methods" LINK
"NIR spectroscopy combined with 1D-convolutional neural network for breast cancerization analysis and diagnosis" LINK
"Associations between visceral adipose tissue estimates produced by near-infrared spectroscopy, mobile anthropometrics, and traditional body composition …" LINK
"Discrimination of Minced Mutton Adulteration Based on Sized-Adaptive Online NIRS Information and 2D Conventional Neural Network."
"Fruit detection research based on near-infrared spectroscopy and lightweight neural network" LINK
"Honey quality detection based on near-infrared spectroscopy" LINK
"Evaluation of the potential of near infrared hyperspectral imaging for monitoring the invasive brown marmorated stink bug" LINK
"Denoising stacked autoencoders-based near-infrared quality monitoring method via robust samples evaluation" LINK
"Visualization research of egg freshness based on hyperspectral imaging and binary competitive adaptive reweighted sampling" LINK
"Desert Soil Salinity Inversion Models Based on Field In Situ Spectroscopy in Southern Xinjiang, China" LINK
"Novel broad spectral response perovskite solar cells: A review of the current status and advanced strategies for breaking the theoretical limit efficiency" LINK
"Remote Sensing: Estimation of Potato Above-Ground Biomass Based on Vegetation Indices and Green-Edge Parameters Obtained from UAVs" LINK
"Prognostic value of syntax score, intravascular ultrasound and near-infrared spectroscopy to identify low-risk patients with coronary artery disease 5-year …" LINK


Data Scientist at Project Canary, PBC - Denver, Colorado, United States

#artificialintelligence

Project Canary is a SaaS-based data analytics company focused on environmental performance, or the "E" in ESG, for energy and additional business sectors. We are the leaders in assessing and scoring responsible operations and provide independent, measured emission profiles, including methane, via high-fidelity continuous monitoring technology that helps companies take ESG action. Formed as a Public Benefit Corporation (B-Corp rating score 107), Project Canary's Denver-based team of technologists, engineers, and seasoned industry operators have earned recognition for their uncompromising standards and high-fidelity data. Project Canary's mission is to fight climate change by delivering actionable insights, starting with the energy sector. We ingest data from various sources, including our own proprietary environmental sensors/hardware, to calculate carbon emissions from different facilities in real time via SaaS.