Is AI research headed in the right direction?

#artificialintelligence

This article is part of Demystifying AI, a series of posts that (try to) disambiguate the jargon and myths surrounding artificial intelligence. In 1956, researchers at Dartmouth College coined the term "artificial intelligence" for a field of science that aims to enable machines to replicate the capabilities of the human mind. AI pioneers believed at the time that, before long, "machines will be capable… of doing any work a man can do." For decades, AI scientists and researchers have been trying to recreate the logic and functionalities of the human brain. And for decades, they have fallen short, disappointing both themselves and the general public.



Everything You Ever Wanted to Know About Artificial Intelligence

#artificialintelligence

Artificial intelligence is overhyped--there, we said it. Superintelligent algorithms aren't about to take all the jobs or wipe out humanity. But software has gotten significantly smarter of late. It's why you can talk to your friends as an animated poop on the iPhone X using Apple's Animoji, or ask your smart speaker to order more paper towels. Tech companies' heavy investments in AI are already changing our lives and gadgets, and laying the groundwork for a more AI-centric future.



What Is Deep Learning?

#artificialintelligence

Deep learning is a subset of machine learning, the branch of artificial intelligence that enables computers to learn tasks from experience rather than explicit programming. It has become increasingly popular in the past few years, thanks to abundant data and increased computing power. It's the main technology behind many of the applications we use every day, including online language translation and automated face-tagging in social media. The technology has also proved useful in healthcare: Earlier this year, computer scientists at the Massachusetts Institute of Technology (MIT) used deep learning to create a new computer program for detecting breast cancer. Classic models required engineers to manually define the rules and logic for detecting cancer; for the new model, the scientists instead gave a deep-learning algorithm 90,000 full-resolution mammogram scans from 60,000 patients and let it find the common patterns between scans of patients who went on to develop breast cancer and those who didn't.
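
To make the contrast with rule-based systems concrete, here is a minimal sketch of the learn-from-data approach described above. It is not the MIT team's actual model; it is a generic Keras convolutional classifier, and the data arrays and the `build_classifier` helper are placeholder names for illustration only.

```python
# Minimal sketch: instead of hand-coding detection rules, define a small
# convolutional network and let it learn patterns from labeled scans.
# Real mammogram data would need domain-specific preprocessing; the
# training arrays below are hypothetical placeholders.
import tensorflow as tf

def build_classifier(input_shape=(256, 256, 1)):
    """Small CNN that maps a grayscale scan to a malignancy probability."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of cancer
    ])

model = build_classifier()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# train_images: array of scans, train_labels: 0/1 outcomes (placeholders)
# model.fit(train_images, train_labels, epochs=10, validation_split=0.2)
```

The point of the sketch is that no detection logic is written by hand: the network's weights are adjusted during training so that the patterns distinguishing positive from negative scans are learned directly from the labeled examples.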