Machine Learning


IBM focuses on shortage of AI talent in IT and security

#artificialintelligence

IBM has been warning about the cybersecurity skills gap for several years and has recently released a report on the lack of artificial intelligence (AI) skills across Europe. The company said in a Friday email to SC Media that cybersecurity has been experiencing a significant workforce and skills shortage globally, and that AI can offer a crucial technology path toward solving it. "Given that AI skillsets are not yet widespread, embedding AI into existing toolsets that security teams are already using in their daily processes will be key to overcoming this barrier," IBM stated in the email. "AI has great potential to solve some of the biggest challenges facing security teams -- from analyzing the massive amounts of security data that exists to helping resource-strapped security teams prioritize threats that pose the greatest risk, or even recommending and automating parts of the response process." Oliver Tavakoli, CTO at Vectra, said that the potential of machine learning (ML) and AI to materially help with a large set of problems across many industries has created an acute imbalance between the supply of and demand for AI talent.


AI transformer models touted to help design new drugs

#artificialintelligence

Special report AI can study chemical molecules in ways scientists can't comprehend, automatically predicting complex protein structures and designing new drugs, despite having no real understanding of science. The power to design new drugs at scale is no longer limited to Big Pharma. Startups armed with the right algorithms, data, and compute can invent tens of thousands of molecules in just a few hours. New machine learning architectures, including transformers, are automating parts of the design process, helping scientists develop new drugs for difficult diseases like Alzheimer's, cancer, or rare genetic conditions. In 2017, researchers at Google introduced the transformer architecture, a method that made it practical to build increasingly large and more powerful neural networks.
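The core of that 2017 transformer architecture is scaled dot-product attention. As a rough illustration only (toy shapes and random values, not any drug-design model), a minimal NumPy sketch looks like this:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V, weights

# Toy example: 3 "tokens", 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a weighted mix of the value vectors, with weights determined by how well the corresponding query matches each key.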


Top 10 AI-Generated Images by DALL-E 2 - Simplified

#artificialintelligence

OpenAI, a San Francisco Artificial Intelligence company closely affiliated with Microsoft, launched an A.I. system and neural network in January 2021 known as DALL-E. Named as a portmanteau of the surrealist artist Salvador Dalí and Pixar's famous movie WALL-E, DALL-E creates images from text. In this blog, we'll let you in on everything you should know about DALL-E and its successor DALL-E 2, and share ten of the most creative AI-generated images from DALL-E 2. Picture of a dog wearing a beret and a turtleneck generated by the DALL-E 2 image generation software. Now, you may be wondering what DALL-E is all about. It's an AI tool that takes a description of an object or a scene and automatically produces an image depicting that scene or object. DALL-E also allows you to edit all the wonderful AI-generated images you've created with simple tools and text modifications.


Advanced Data Science with IBM

#artificialintelligence

Apache Spark is the de-facto standard for large-scale data processing. This is the first course of a series of courses towards the IBM Advanced Data Science Specialization. We strongly believe that it is crucial for success to start by learning a scalable data science platform, since memory and CPU constraints are the most limiting factors when it comes to building advanced machine learning models. In this course we teach you the fundamentals of Apache Spark using Python and PySpark. We'll introduce Apache Spark in the first two weeks and learn how to apply it to basic exploratory and data pre-processing tasks in the last two weeks.


Teach yourself data science at your own pace for less than $40

#artificialintelligence

The following content is brought to you by ZDNet partners. If you buy a product featured here, we may earn an affiliate commission or other compensation. Artificial intelligence (AI) has become so commonplace that it's easy to forget it was once a science fiction pipe dream. But AI and the machine learning concepts behind it are still new enough that programmers and data scientists will be in demand for the foreseeable future. So if you want to pursue a career in one of the fields where data science know-how is essential, this e-learning bundle can serve as a great first step.


Advanced Reinforcement Learning: policy gradient methods

#artificialintelligence

Sample efficiency for policy gradient methods is pretty poor. We throw out each batch of data immediately after just one gradient step. This is the most complete Reinforcement Learning course series on Udemy. In it, you will learn to implement some of the most powerful Deep Reinforcement Learning algorithms in Python using PyTorch and PyTorch lightning. You will implement from scratch adaptive algorithms that solve control tasks based on experience.
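The sample-inefficiency point above can be seen in a minimal REINFORCE sketch on a two-armed bandit (NumPy only; the environment, reward values, and hyperparameters here are invented for illustration, not taken from the course): each on-policy batch of actions yields exactly one gradient step on the softmax policy and is then discarded.

```python
import numpy as np

rng = np.random.default_rng(42)
true_rewards = np.array([0.2, 0.8])   # arm 1 is the better arm
theta = np.zeros(2)                   # softmax policy logits
lr = 0.1

def softmax(z):
    z = z - z.max()                   # numerical stability
    e = np.exp(z)
    return e / e.sum()

for _ in range(500):
    probs = softmax(theta)
    # Sample a small on-policy batch of actions and noisy rewards
    actions = rng.choice(2, size=16, p=probs)
    rewards = true_rewards[actions] + rng.normal(0, 0.1, size=16)
    # REINFORCE gradient estimate: mean of (grad log pi(a)) * reward
    grad = np.zeros(2)
    for a, r in zip(actions, rewards):
        grad += (np.eye(2)[a] - probs) * r   # d/dtheta of log softmax
    theta += lr * grad / len(actions)
    # The batch is now thrown away -- hence the poor sample efficiency

final_probs = softmax(theta)          # should strongly favor arm 1
```

After training, the policy concentrates on the higher-reward arm, but only because thousands of one-use samples were burned along the way.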


Deep Learning with PyTorch

#artificialintelligence

Deep Learning with PyTorch teaches you to create neural networks and deep learning systems with PyTorch. This program is specially designed for people who want to start using PyTorch for building AI, Machine Learning, or Deep Learning models and applications. This program will help you learn how PyTorch can be used for developing deep learning models. You'll learn core PyTorch concepts such as tensors, autograd, and the automatic differentiation package. Also, this program will give you a brief overview of deep learning concepts.
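To make "autograd" concrete without requiring a PyTorch install, here is a toy sketch of what reverse-mode automatic differentiation does under the hood (a simplified scalar version in the spirit of autograd, not PyTorch's actual implementation):

```python
class Scalar:
    """Toy reverse-mode autodiff node: records parents and local gradients."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # list of (parent_node, local_gradient) pairs

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Scalar(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Scalar(self.value * other.value,
                      [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Chain rule: accumulate upstream gradient, then push to parents
        self.grad += upstream
        for node, local in self._parents:
            node.backward(upstream * local)

x = Scalar(3.0)
y = Scalar(4.0)
z = x * y + x      # z = x*y + x, so dz/dx = y + 1 and dz/dy = x
z.backward()
```

Calling `backward()` walks the recorded graph and accumulates gradients, which is essentially what `tensor.backward()` does in PyTorch, just vectorized and far more efficiently.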


ep.351: Early Days of ICRA Competitions, with Bill Smart

Robohub

Bill Smart, Professor of Mechanical Engineering and Robotics at Oregon State University, helped start competitions as part of ICRA. In this episode, Bill dives into the high-level decisions involved with creating a meaningful competition. The conversation explores how competitions are there to showcase research, potential ideas for future competitions, the exciting phase of robotics we are currently in, and the intersection of robotics, ethics, and law. Dr. Smart does research in the areas of robotics and machine learning. In robotics, Smart is particularly interested in improving the interactions between people and robots; enabling robots to be self-sufficient for weeks and months at a time; and determining how they can be used as personal assistants for people with severe motor disabilities.


Data Centers Need to Go Green - And AI Can Help

#artificialintelligence

Climate change is here, and it's set to get much worse, experts say, and as a result, many industries have pledged to reduce their carbon footprints in the coming decades. Now, the recent jump in energy prices, due mainly to the war in Ukraine, also emphasizes the need for development of cheap, renewable forms of energy from freely available sources, like the sun and wind, as opposed to reliance on fossil fuels controlled by nation-states. But going green is easier for some industries than for others, and one area where it is likely to be a significant challenge is data centers, which require huge amounts of electricity to cool what can amount to millions of deployed computers. Growing consumer demand to reduce carbon output, along with rules that regulators are likely to impose in the near future, require companies that run data centers to take immediate steps to go green. And artificial intelligence, machine learning, neural networks, and other related technologies can help enterprises of all kinds achieve that goal, without having to spend huge sums to accomplish it.


Traditional vs Deep Learning Algorithms in the Telecom Industry -- Cloud Architecture and Algorithm Categorization

#artificialintelligence

The unprecedented growth of mobile devices, applications, and services has placed the utmost demand on mobile and wireless networking infrastructure. Rapid research and development of 5G systems has found ways to support mobile traffic volumes, real-time extraction of fine-grained analytics, and agile management of network resources, so as to maximize user experience. Moreover, inference over heterogeneous mobile data from distributed devices faces challenges due to computational and battery power limitations. ML models deployed at edge servers are constrained to be lightweight, boosting performance by trading off model complexity against accuracy. Model compression, pruning, and quantization are also largely in place.
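As a minimal illustration of one of these techniques, here is a sketch of post-training symmetric 8-bit linear quantization of a weight array (NumPy only; the scale scheme is deliberately simplified relative to real frameworks, and the weights are random stand-ins):

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric linear quantization of float weights to int8."""
    scale = np.abs(weights).max() / 127.0   # map the max magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.5, size=(4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)   # ~4x smaller storage, small rounding error
```

Storing `q` instead of `w` cuts memory and bandwidth by roughly 4x, which is exactly the kind of trade-off between model size and accuracy that edge deployments need; the maximum reconstruction error is half a quantization step.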