AI And The Digital Mine

#artificialintelligence

When you think of the words "data" and "mine", no doubt the idea of data mining comes first. However, just as we find value in mining the rich resources of data, so too can we apply advanced data techniques to real-world mining -- that is, extracting natural resources from the earth. The world is just as dependent on natural resources as it is on data resources, so it makes sense to see how the evolving areas of artificial intelligence and machine learning are affecting the world of mining and natural resource extraction. Mining has always been a dangerous profession, since extracting minerals, natural gas, petroleum, and other resources requires working in conditions that can be hazardous to human life. Increasingly, we need to go to harsher environments, such as deep under the ocean or deep inside the earth, to extract the resources we still need.


What Machine Learning Trends Can We Expect for Manufacturing in 2020?

ManufacturingTomorrow

#artificialintelligence

Industry 4.0, or the fourth industrial revolution, has called for a merger between automated solutions and smarter, more effective operations through the application of real-time data collection. Essentially, IoT and data-based technologies will feed real-time content into an AI platform, which will then use machine learning algorithms to analyze it and extract actionable insights. Beyond that, the data solutions may support additional systems by controlling robots or informing various processes to influence output. In other words, machine learning and AI allow for a degree of autonomy never before seen in the field. While they are incredibly promising technologies, they're still relatively new to the industry, which means manufacturers are looking for fresh and innovative ways to apply them.


A Layman's Guide to Deep Convolutional Neural Networks

#artificialintelligence

This post is part of a Medium-based 'A Layman's Guide to Deep Learning' series that I plan to publish incrementally. The target audience is beginners with basic programming skills, preferably in Python. This post assumes you have a basic understanding of Deep Neural Networks (DNNs); a detailed post covering them was published previously -- A Layman's Guide to Deep Neural Networks. Reading that post is highly recommended for a better understanding of this one. 'Computer Vision' as a field has evolved to new heights with the advent of deep learning.


Optimizing Loops in Pandas for Enhanced Performance

Machine Learning Py

#artificialintelligence

In this tutorial, you will learn different ways of optimizing loops in pandas. Pandas is one of the most popular Python libraries among data scientists. While performing data analysis and data manipulation tasks in pandas, you may sometimes want to loop/iterate over a DataFrame and perform an operation on each row. While this can be a simple task if the data is small, it becomes cumbersome and very time-consuming on a larger dataset. So, we need to find an efficient way to loop through a pandas DataFrame.
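The speed gap the excerpt describes can be sketched with a toy DataFrame. This is a minimal illustration, not code from the tutorial itself; the column names `price` and `quantity` are invented for the example:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Hypothetical order data -- column names are illustrative only.
df = pd.DataFrame({
    "price": rng.uniform(1, 100, size=10_000),
    "quantity": rng.integers(1, 10, size=10_000),
})

# Slow: iterating row by row with iterrows() builds a Series per row.
def total_loop(df):
    total = 0.0
    for _, row in df.iterrows():
        total += row["price"] * row["quantity"]
    return total

# Fast: a vectorized expression operates on whole columns at once.
def total_vectorized(df):
    return (df["price"] * df["quantity"]).sum()
```

Both functions return the same total, but the vectorized version is typically orders of magnitude faster because the work runs in NumPy's compiled loops rather than Python-level iteration.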


Introduction to Linear Algebra for Applied Machine Learning with Python

#artificialintelligence

Linear algebra is to machine learning as flour is to baking: every machine learning model is based on linear algebra, as every cake is based on flour. It is not the only ingredient, of course. Machine learning models need vector calculus, probability, and optimization, as cakes need sugar, eggs, and butter. Applied machine learning, like baking, is essentially about combining these mathematical ingredients in clever ways to create useful (tasty?) models. This document contains introductory-level linear algebra notes for applied machine learning. It is meant as a reference rather than a comprehensive review. It is also a good introduction for people who don't need a deep understanding of linear algebra, but still want to learn the fundamentals in order to read about machine learning or to use pre-packaged machine learning solutions. Further, it is a good source for people who learned linear algebra a while ago and need a refresher. These notes are based on a series of (mostly) freely ...
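The flour analogy can be made concrete: at prediction time, a linear model is nothing but a matrix-vector product plus a bias. The numbers below are made up purely for illustration:

```python
import numpy as np

# Each row of X is one example, each column one feature.
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])   # 3 examples, 2 features

w = np.array([0.5, -1.0])    # learned weight vector
b = 2.0                      # bias term

# The whole batch of predictions is a single matrix-vector product.
y_hat = X @ w + b            # shape (3,)
```

The same `X @ w` pattern underlies linear regression, logistic regression, and each layer of a neural network, which is why these notes treat linear algebra as the base ingredient.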


Drone deliveries are making their case in a crisis

Engadget

It feels like drones were built for this moment. The coronavirus pandemic has forced everyone to spend the majority of their time indoors and, where possible, maintain a healthy distance from anyone that doesn't live in the same building. Companies have introduced numerous measures to minimize the threat and spread of infection. Countless stores have acrylic screens, for instance, and many delivery drivers leave orders at your doorstep. But a robot -- or specifically, a drone -- offers a potentially safer and quicker method of exchanging goods and services. It's no wonder, then, that so many commercial UAV (unmanned aerial vehicle) operators are flourishing at the moment.


Why the coronavirus pandemic confuses AI algorithms

#artificialintelligence

This article is part of Demystifying AI, a series of posts that (try to) disambiguate the jargon and myths surrounding AI. At some point, every one of us has had the feeling that online applications like YouTube, Amazon, and Spotify seem to know us better than we know ourselves, recommending content we like even before we ask for it. At the heart of these platforms' success are artificial intelligence algorithms--or more precisely, machine learning models--that can find intricate patterns in huge sets of data. Corporations in different sectors leverage the power of machine learning, along with the availability of big data and compute resources, to bring remarkable enhancements to all sorts of operations, including content recommendation, inventory management, sales forecasting, and fraud detection. Yet, despite their seemingly magical behavior, current AI algorithms are very efficient statistical engines that can predict outcomes only as long as inputs don't deviate too much from the norm.
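The closing claim, that statistical models hold up only while inputs stay near the training distribution, can be sketched with a toy predictor. The demand numbers and the pandemic-style shift are invented for illustration, and the "model" here is deliberately simple (it predicts the historical mean), standing in for any statistical learner:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training" data: a year of daily demand under normal conditions.
train = rng.normal(loc=100, scale=5, size=365)
prediction = train.mean()  # the model's forecast for any future day

# In-distribution week: demand still hovers around 100.
normal_week = rng.normal(loc=100, scale=5, size=7)

# Distribution shift: demand triples (e.g. panic buying).
shifted_week = rng.normal(loc=300, scale=30, size=7)

err_normal = abs(normal_week.mean() - prediction)
err_shifted = abs(shifted_week.mean() - prediction)
```

Under normal conditions the forecast error is small; after the shift it explodes, even though nothing about the model changed. That is the failure mode the article attributes to pandemic-era AI systems.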


How to Turn Your Business into a Cognitive Enterprise with AI Technologies?

Hacker Noon

#artificialintelligence

Artificial Intelligence is everywhere, and opportunities abound for cognitive enterprises. What do we mean by cognitive enterprises? Millions of ideas and think pieces are waiting to flourish, and cognitive AI technologies will play a bigger role in turning your ideas into a live piece of work. It is expected that AI will bring simplicity to complex business issues and deliver more useful, engaging, intuitive, and profitable solutions; this is what we call a cognitive approach for enterprises. According to a report published by IDC, a market research firm, global spending on cognitive AI systems will reach $57.6 billion by 2021.


Artificial Intelligence: Best Human Practices and Uses

SaveDelete

#artificialintelligence

The world's tech giants, from Amazon to Alibaba, are in a race to become the leaders in artificial intelligence. These companies are AI trailblazers, embracing AI to deliver next-level products and services. Here are some of the best examples of how they are using artificial intelligence in practice. Waymo, the self-driving technology division of Alphabet, Google's parent company, started as a project at Google. Waymo aims to bring self-driving technology to the world today, to move people around and reduce accidents and crashes.


See Boston Dynamics' robodog herd sheep and explore in New Zealand

Mashable

Spot, the robotic "dog" designed by Boston Dynamics, has had a busy pandemic, between counseling patients and enforcing social distancing guidelines. Now, a new partnership with a New Zealand robotics firm is setting up the four-legged automaton for a new line of work: farming. Technically, the partnership is much bigger than that. Rocos specializes in the remote monitoring and operation of robot fleets. Through the collaboration, the capabilities of Boston Dynamics robots like Spot will expand, with human operators able to manage their performance from a great distance.