
LafargeHolcim launches Industry 4.0 for cement production – Australian Bulk Handling Review

#artificialintelligence

LafargeHolcim will implement automation and robotics, artificial intelligence, predictive maintenance and digital twin technologies for its production process. The company is upgrading its production fleet for the future through its 'Plants of Tomorrow' program, which will be rolled out over four years as LafargeHolcim upgrades its technologies in the building materials industry. The company predicts a 'Plants of Tomorrow' certified operation will show operational efficiency gains of 15 to 20 percent compared with a conventional cement plant.

Among the technologies being implemented are predictive operations that can detect abnormal conditions and process anomalies in real time. This aims to reduce maintenance costs by more than 10 percent and significantly lower energy costs. Digital twins of plants will also be created to optimise training opportunities. Automation and robotics are another important element of the strategy, with unmanned surveillance being performed for high-exposure jobs across the entire plant. Partnering with Swiss start-up Flyability, the company is using drones to inspect confined spaces, allowing the frequency of inspections to increase while reducing cost and improving safety for employees. In addition, the new PACT (Performance and Collaboration) digital tool shifts operational decision-making from experience-based to data-centric by combining data from various sources and enabling machine learning applications.

LafargeHolcim is currently working on more than 30 pilot projects covering all regions where the company is active. The first integrated cement plant will be at LafargeHolcim's premises in Siggenthal, Switzerland, where all modules of the 'Plants of Tomorrow' program will be tested. LafargeHolcim Global Head of Cement Manufacturing, Solomon Baumgartner Aviles, said: "Transforming the way we produce cement is one of the focus areas of our digitalisation strategy and the 'Plants of Tomorrow' initiative will turn Industry 4.0 into reality at our plants. These innovative solutions make cement production safer, more efficient and environmentally fit."
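The article does not describe LafargeHolcim's predictive-operations stack. Purely as a generic illustration of how real-time anomaly detection on plant sensor data can work, the sketch below flags readings that drift far from a rolling baseline; the function name, signal and thresholds are all hypothetical.

```python
import numpy as np

def rolling_zscore_anomalies(readings, window=60, threshold=4.0):
    """Flag sensor readings that deviate strongly from a rolling baseline.

    readings:  1-D array of sensor values (e.g. kiln temperature samples).
    window:    number of recent samples used as the baseline.
    threshold: how many standard deviations count as "abnormal".
    """
    readings = np.asarray(readings, dtype=float)
    flags = np.zeros(len(readings), dtype=bool)
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flags[i] = True  # candidate process anomaly
    return flags

# Example: a simulated temperature trace with one injected spike.
rng = np.random.default_rng(0)
trace = 850 + rng.normal(0, 2, 500)   # hypothetical kiln temperature in degrees C
trace[400] += 30                      # injected abnormal reading
print(np.where(rolling_zscore_anomalies(trace))[0])  # expected: index 400
```

In a real plant this kind of rule would be one small piece of a streaming pipeline; the point here is only that "detecting process anomalies in real time" reduces to comparing live readings against a learned or rolling notion of normal.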


What Is Deep Learning? Meaning and How It Works

#artificialintelligence

Deep learning is a sub-field of machine learning and an aspect of artificial intelligence. Put simply, it is meant to emulate the learning approach that humans use to acquire certain types of knowledge. It is often confused with machine learning more broadly, but the two differ: deep learning stacks many layers of processing to learn increasingly abstract representations, whereas much of classical machine learning relies on simpler, often linear, models. To see the idea more concretely, consider a child learning what a flower is: shown a new object, the child asks again and again, "Is this a flower?", refining the concept with each answer.
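To make the linear-versus-layered contrast concrete, here is a short illustrative comparison (not from the article) that fits a linear classifier and a small multi-layer network to the same non-linearly separable toy data using scikit-learn:

```python
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Toy data whose two classes cannot be separated by a straight line.
X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A linear model versus a small layered (multi-layer perceptron) model.
linear = LogisticRegression().fit(X_train, y_train)
layered = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000,
                        random_state=0).fit(X_train, y_train)

print("linear model accuracy: ", linear.score(X_test, y_test))   # roughly 0.85
print("layered model accuracy:", layered.score(X_test, y_test))  # typically > 0.95
```

The layered model learns the curved class boundary that the linear one cannot represent, which is the essence of the distinction the article is gesturing at.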


How AI & Data Analytics Is Impacting Indian Legal System

#artificialintelligence

In a survey conducted by Gurugram-based BML Munjal University (School of Law) in July 2020, it was found that about 42% of lawyers believed that in the next 3 to 5 years as much as 20% of regular, day-to-day legal work could be performed with technologies such as artificial intelligence. The survey also found that about 94% of law practitioners favoured research and analytics as the most desirable skills in young lawyers. Earlier this year, Chief Justice of India SA Bobde, in no uncertain terms, underlined that the Indian judiciary must equip itself to incorporate artificial intelligence into its system, especially in dealing with document management and cases of a repetitive nature. With more industries and professional sectors embracing AI and data analytics, the legal industry, albeit in a limited way, is no exception. According to the 2020 report of the National Judicial Data Grid, over the last decade 3.7 million cases were pending across various courts in India, including high courts, district and taluka courts.



Artificial Intelligence in Medicine Market to Witness Robust Expansion by 2026 with Top Key …

#artificialintelligence

Artificial Intelligence in Medicine Market research is an intelligence report with meticulous efforts undertaken to study the right and valuable …


Four steps to accelerate the journey to machine learning - SiliconANGLE

#artificialintelligence

A good example of solving for the right problems can be seen in Formula One World Championship Ltd. The motorsport company was looking for new ways to deliver race metrics that could change the way fans and teams experience racing, but had more than 65 years of historical race data to sift through. After aligning its technical and domain experts to determine which untapped data had the most potential to deliver value for teams and fans, Formula 1's data scientists used Amazon SageMaker to train deep learning models on this historical data, extracting critical performance statistics, making race predictions and giving fans engaging insights into the split-second decisions and strategies adopted by teams and drivers.
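The article does not show Formula 1's actual pipeline. As a hedged sketch of what launching a managed training job with the SageMaker Python SDK can look like, the snippet below assumes a hypothetical PyTorch training script (train.py), S3 bucket and IAM role.

```python
import sagemaker
from sagemaker.pytorch import PyTorch

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical role ARN

# A managed training job: SageMaker provisions the instance, runs train.py,
# and writes the model artifacts back to S3 when the job finishes.
estimator = PyTorch(
    entry_point="train.py",          # hypothetical script holding the model and training loop
    role=role,
    instance_count=1,
    instance_type="ml.p3.2xlarge",   # single-GPU training instance
    framework_version="1.13",
    py_version="py39",
    hyperparameters={"epochs": 20, "batch-size": 256},
)

# Historical race telemetry prepared as files in S3 (hypothetical path).
estimator.fit({"train": "s3://example-bucket/f1-historical-data/"})
```

The design point the passage makes still holds: the hard part is deciding which historical data is worth modelling; the training infrastructure itself is largely a managed service call.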


How to Train Your First Deep Learning Model

#artificialintelligence

This will be an interactive post using Google Colab notebooks. If you have not used Google Colab before, there is a quick-start tutorial at tutorialspoint. You can access the notebook at this link: Train your first DL model. First, make a copy and save it into your Drive so that you can access it and make changes. Next, make sure the runtime is set to GPU so you can make use of the free resources provided by Google.
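The notebook itself sits behind the link above. As a minimal sketch of what a first deep learning model in Colab often looks like, assuming TensorFlow/Keras (which Colab ships with) and the built-in MNIST dataset rather than whatever the linked notebook uses, you might see something like this:

```python
import tensorflow as tf

# Confirm the Colab runtime actually exposes a GPU (Runtime -> Change runtime type -> GPU).
print("GPUs visible:", tf.config.list_physical_devices("GPU"))

# Load and scale the MNIST handwritten-digit dataset bundled with Keras.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network: flatten the image, one hidden layer, softmax output.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
print("test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```

Five epochs of this model typically reach around 97-98% test accuracy, which is enough to confirm the GPU runtime and the training loop are working end to end.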


New AI Paradigm May Reduce a Heavy Carbon Footprint

#artificialintelligence

Machine learning, the workhorse of artificial intelligence (AI), can have a considerable carbon footprint. Deep learning is inherently costly, as it requires massive computational and energy resources. Now researchers in the U.K. have discovered how to create an energy-efficient artificial neural network without sacrificing accuracy, publishing the findings in Nature Communications on August 26, 2020. The biological brain is the inspiration for neuromorphic computing, an interdisciplinary approach that draws upon neuroscience, physics, artificial intelligence, computer science, and electrical engineering to create artificial neural systems that mimic biological functions and systems. The human brain is a complex system of roughly 86 billion neurons and hundreds of trillions of synapses.
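This summary does not detail the paper's specific hardware approach. Purely as an illustration of the kind of spiking, event-driven unit neuromorphic systems are built from, here is a minimal leaky integrate-and-fire neuron simulation; all constants are arbitrary illustrative values, not from the published work.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_reset=0.0, v_threshold=1.0, resistance=1.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane potential leaks toward v_rest, integrates the input current,
    and emits a spike (then resets) whenever it crosses v_threshold.
    """
    v = v_rest
    spike_times = []
    for t, current in enumerate(input_current):
        dv = (-(v - v_rest) + resistance * current) * (dt / tau)
        v += dv
        if v >= v_threshold:
            spike_times.append(t * dt)  # record spike time in seconds
            v = v_reset
    return spike_times

# Constant drive strong enough to make the neuron spike periodically.
drive = np.full(1000, 1.5)            # 1 second of input at 1 ms resolution
print("first spike times (s):", simulate_lif(drive)[:5])
```

Because such neurons only "fire" on discrete events rather than multiplying dense matrices every step, hardware built around them can in principle spend far less energy per inference, which is the efficiency argument behind neuromorphic approaches in general.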


WHAT IS EDGE AI…?

#artificialintelligence

These days we hear a lot about AI, but have you ever heard about Edge AI? What does it mean and what is it used for? The network edge, or simply the edge, is where data is collected and resides. Edge computing processes data in local places such as computers, IoT devices or edge servers; moving computation to the network edge reduces long-distance communication between client and server. With Edge AI, AI algorithms process sensor data or signals locally, on the hardware device that produced them, in less than a few milliseconds, providing real-time information. Today, most AI algorithms still run as deep learning models in cloud data centers, which consume heavy compute capacity.
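As a hedged sketch of what on-device inference can look like, the snippet below uses TensorFlow Lite, a common but by no means the only edge runtime; the model file and input shape are hypothetical.

```python
import numpy as np
import tensorflow as tf

# Load a pre-converted .tflite model from local storage on the device (hypothetical file).
interpreter = tf.lite.Interpreter(model_path="sensor_classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A single sensor reading shaped to match the model's expected input.
sample = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

# Run inference entirely on the device: no round-trip to a cloud data center.
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
print("local prediction:", prediction)
```

The design trade-off is the one the article describes: a smaller, pre-converted model runs within milliseconds on the device, while the heavy training and large models stay in the cloud.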


Pinaki Laskar posted on LinkedIn

#artificialintelligence

The #AI value chain:
1) AI chip and hardware makers who are looking to power all the AI applications that will be woven into the fabric of organisations big and small globally.
2) The #cloud platform and infrastructure providers who will host the AI applications.
3) The AI #algorithms and cognitive-services building-block makers who provide the vision recognition, speech and #deeplearning predictive models that power AI applications.
4) Enterprise solution providers whose software is used in customer, HR, and asset management and planning applications.
5) Industry vertical solution providers who are looking to use AI to power companies across sectors from healthcare to finance.
6) Corporate takers of AI who are looking to increase revenues, drive efficiencies and deepen their insights.
Today's AI is defined by what Big Tech and the global social media platforms are pushing: Narrow/Weak AI/ML/DL, packaged as "cloud DL/AI platforms". But these #MachineLearning algorithms are designed to optimise a cost/loss function; they have no intelligence, understanding or reasoning. Most of the curve-fitting AI tools sold today focus on predicting, identifying or classifying things, a rote "learning from data/experience".
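To ground the claim that such algorithms simply optimise a cost/loss function, here is a minimal, generic sketch (not tied to any vendor's platform) of gradient descent fitting a line by minimising squared error:

```python
import numpy as np

# Synthetic data generated from y = 3x + 1 with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 3 * x + 1 + rng.normal(0, 0.1, 200)

w, b, lr = 0.0, 0.0, 0.1
for step in range(500):
    pred = w * x + b
    # The cost/loss function being optimised: mean squared error.
    loss = np.mean((pred - y) ** 2)
    # Gradients of the loss with respect to the parameters.
    grad_w = 2 * np.mean((pred - y) * x)
    grad_b = 2 * np.mean(pred - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final loss={loss:.4f}")  # curve fitting, nothing more
```

Deep learning scales this same loop to millions of parameters, but the objective remains a loss function being driven downhill, which is exactly the "curve fitting" the post is describing.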