Machine Learning


Advanced Data Science with IBM

#artificialintelligence

Apache Spark is the de facto standard for large-scale data processing. This is the first course in a series of courses towards the IBM Advanced Data Science Specialization. We strongly believe that it is crucial for success to start by learning a scalable data science platform, since memory and CPU constraints are the most limiting factors when it comes to building advanced machine learning models. In this course we teach you the fundamentals of Apache Spark using Python and PySpark. We'll introduce Apache Spark in the first two weeks and learn how to apply it to basic exploratory and data pre-processing tasks in the last two weeks.
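A minimal sketch (not taken from the course) of the kind of exploratory and pre-processing work described above; the file name and column names are hypothetical placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("exploration").getOrCreate()

# Load a CSV file, letting Spark infer the schema.
df = spark.read.csv("measurements.csv", header=True, inferSchema=True)

# Basic exploration: schema, row count, summary statistics.
df.printSchema()
print(df.count())
df.describe().show()

# Simple pre-processing: drop rows with nulls and standardise a column.
stats = df.select(F.mean("value").alias("mu"),
                  F.stddev("value").alias("sigma")).first()
cleaned = (df.dropna()
             .withColumn("value_z", (F.col("value") - stats.mu) / stats.sigma))
cleaned.show(5)
```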


Teach yourself data science at your own pace for less than $40

#artificialintelligence

Artificial intelligence (AI) has become so commonplace that it's easy to forget it was once a science fiction pipe dream. But AI and the machine learning concepts behind it are still new enough that programmers and data scientists will be in demand for the foreseeable future. So if you want to pursue a career in one of the fields where data science know-how is essential, this e-learning bundle can serve as a great first step.


Advanced Reinforcement Learning: policy gradient methods

#artificialintelligence

Sample efficiency for policy gradient methods is pretty poor: we throw out each batch of data immediately after just one gradient step. This is the most complete Reinforcement Learning course series on Udemy. In it, you will learn to implement some of the most powerful Deep Reinforcement Learning algorithms in Python using PyTorch and PyTorch Lightning. You will implement, from scratch, adaptive algorithms that solve control tasks based on experience.
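A minimal REINFORCE-style sketch of the sample-efficiency point above: each batch of trajectories feeds exactly one gradient step and is then discarded. The function name and arguments are illustrative, not from the course.

```python
import torch

def reinforce_step(optimizer, log_probs, returns):
    """One gradient step on a freshly collected batch of trajectories.

    log_probs: log pi(a_t | s_t) for the sampled actions, shape (T,)
    returns:   discounted returns G_t for the same steps, shape (T,)
    """
    loss = -(log_probs * returns).mean()  # policy gradient surrogate loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # After this single step the batch is stale (it was sampled from the
    # old policy), so on-policy methods discard it and collect new data.
    return loss.item()
```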


Deep Learning with PyTorch

#artificialintelligence

Deep Learning with PyTorch teaches you to create neural networks and deep learning systems with PyTorch. This program is specially designed for people who want to start using PyTorch for building AI, Machine Learning, or Deep Learning models and applications. This program will help you learn how PyTorch can be used for developing deep learning models. You'll learn PyTorch concepts like tensors, autograd, and the automatic differentiation package. This program will also give you a brief overview of deep learning concepts.
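A small illustration (not from the program) of the tensor and autograd concepts it mentions: PyTorch records operations on tensors created with requires_grad=True and computes gradients automatically.

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # y = x1^2 + x2^2

y.backward()         # automatic differentiation via autograd
print(x.grad)        # dy/dx = 2x -> tensor([4., 6.])
```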


ep.351: Early Days of ICRA Competitions, with Bill Smart

Robohub

Bill Smart, Professor of Mechanical Engineering and Robotics at Oregon State University, helped start competitions as part of ICRA. In this episode, Bill dives into the high-level decisions involved with creating a meaningful competition. The conversation explores how competitions are there to showcase research, potential ideas for future competitions, the exciting phase of robotics we are currently in, and the intersection of robotics, ethics, and law. Dr. Smart does research in the areas of robotics and machine learning. In robotics, Smart is particularly interested in improving the interactions between people and robots; enabling robots to be self-sufficient for weeks and months at a time; and determining how they can be used as personal assistants for people with severe motor disabilities.


Data Centers Need to Go Green - And AI Can Help

#artificialintelligence

Climate change is here, and it's set to get much worse, experts say, and as a result many industries have pledged to reduce their carbon footprints in the coming decades. The recent jump in energy prices, due mainly to the war in Ukraine, also emphasizes the need to develop cheap, renewable forms of energy from freely available sources, like the sun and wind, as opposed to relying on fossil fuels controlled by nation-states. But going green is easier for some industries than for others, and one area where it is likely to be a significant challenge is data centers, which require huge amounts of electricity to cool, in some cases, millions of deployed computers. Growing consumer demand to reduce carbon output, along with rules that regulators are likely to impose in the near future, requires companies that run data centers to take immediate steps to go green. And artificial intelligence, machine learning, neural networks, and other related technologies can help enterprises of all kinds achieve that goal without having to spend huge sums to accomplish it.


Traditional vs Deep Learning Algorithms in the Telecom Industry -- Cloud Architecture and Algorithm Categorization

#artificialintelligence

The unprecedented growth of mobile devices, applications, and services has placed the utmost demand on mobile and wireless networking infrastructure. Rapid research and development of 5G systems has found ways to support mobile traffic volumes, real-time extraction of fine-grained analytics, and agile management of network resources, so as to maximize user experience. Moreover, inference on heterogeneous mobile data from distributed devices faces challenges due to computational and battery power limitations. ML models deployed at edge servers are constrained to be lightweight, boosting performance by trading off model complexity against accuracy. To that end, model compression, pruning, and quantization are largely in place.
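A minimal sketch of the model-slimming techniques named above, using PyTorch's built-in pruning and dynamic quantization utilities on a toy model; the tiny architecture is a hypothetical stand-in for an edge-server model.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 8))

# Pruning: zero out the 30% of weights with the smallest L1 magnitude.
prune.l1_unstructured(model[0], name="weight", amount=0.3)
prune.remove(model[0], "weight")  # make the pruning permanent

# Dynamic quantization: store Linear weights as int8 to shrink the model.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
print(quantized)
```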


Natural Language Processing in TensorFlow

#artificialintelligence

If you are a software developer who wants to build scalable AI-powered algorithms, you need to understand how to use the tools to build them. This Specialization will teach you best practices for using TensorFlow, a popular open-source framework for machine learning. In Course 3 of the deeplearning.ai TensorFlow Specialization, you will build natural language processing systems using TensorFlow. You will learn to process text, including tokenizing and representing sentences as vectors, so that they can be input to a neural network.
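A short sketch (not from the course) of the tokenization step it describes: turning sentences into padded integer vectors that a neural network can consume. The example sentences are made up.

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = ["I love my dog",
             "I love my cat",
             "Do you think my dog is amazing?"]

tokenizer = Tokenizer(num_words=100, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)               # build the word index
sequences = tokenizer.texts_to_sequences(sentences)
padded = pad_sequences(sequences, padding="post")

print(tokenizer.word_index)
print(padded)  # one fixed-length integer vector per sentence
```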


A Beginner's Guide to AutoML - Solita Data

#artificialintelligence

Automated Machine Learning (AutoML) is a concept that provides the means for non-Machine Learning experts to utilise existing data and create models. In addition to that, AutoML provides Machine Learning (ML) professionals ways to develop and use effective models without spending time on tasks such as data cleaning and preprocessing, feature engineering, model selection, hyperparameter tuning, etc. Before we move any further, it is important to note that AutoML is not some system that has been developed by a single entity. Several organisations have developed their own AutoML packages. These packages cover a broad area and target people at different skill levels.
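One concrete example of the pipeline search the excerpt describes, using TPOT, which is chosen purely for illustration and is not named in the original. TPOT automates preprocessing, model selection, and hyperparameter tuning by evolving candidate scikit-learn pipelines.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = TPOTClassifier(generations=5, population_size=20, random_state=0)
automl.fit(X_train, y_train)           # evolves candidate pipelines
print(automl.score(X_test, y_test))    # accuracy of the best pipeline found
automl.export("best_pipeline.py")      # emit the winning pipeline as code
```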


3 + 1 ways of running R on Amazon SageMaker

#artificialintelligence

The R programming language is one of the most widely used languages in the scientific space: it is among the most commonly used languages for machine learning (probably second only to Python) and arguably the most popular language amongst mathematicians and statisticians. It is easy to get started with and free to use, with support for many scientific and visualisation libraries. While R can help you analyse your data, the more data you have, the more compute power you require; and the more impactful your analysis, the more repeatability and reproducibility are required. Analysts and Data Scientists need to find ways to fulfil such requirements. In this post we briefly describe the main ways of running your R workloads on the cloud, making use of Amazon SageMaker, the end-to-end Machine Learning cloud offering from AWS.
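A hedged sketch of one pattern alluded to above: launching an R script as a SageMaker Processing job from the SageMaker Python SDK, using a container image that has Rscript installed. The image URI, IAM role, and script name are placeholders, not values from the post.

```python
from sagemaker.processing import ScriptProcessor

processor = ScriptProcessor(
    image_uri="<account>.dkr.ecr.<region>.amazonaws.com/r-processing:latest",
    command=["Rscript"],   # run the submitted script with R
    role="<your-sagemaker-execution-role>",
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

processor.run(code="analysis.R")  # uploads and executes the R script
```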