The Godfathers of the AI Boom Win Computing's Highest Honor

#artificialintelligence

In the late 1980s, Canadian master's student Yoshua Bengio became captivated by an unfashionable idea. A handful of artificial intelligence researchers were trying to craft software that loosely mimicked how networks of neurons process data in the brain, despite scant evidence it would work. "I fell in love with the idea that we could both understand the principles of how the brain works and also construct AI," says Bengio, now a professor at the University of Montreal. More than 20 years later, the tech industry fell in love with that idea, too. Neural networks are behind the recent bloom of progress in AI that has enabled projects such as self-driving cars and phone bots practically indistinguishable from people.


'Godfathers of AI' Receive Turing Award, the Nobel Prize of Computing - AI Trends

#artificialintelligence

The 2018 Turing Award, known as the "Nobel Prize of computing," has been given to a trio of researchers who laid the foundations for the current boom in artificial intelligence. Yoshua Bengio, Geoffrey Hinton, and Yann LeCun -- sometimes called the 'godfathers of AI' -- have been recognized with the $1 million annual prize for their work developing the AI subfield of deep learning. The techniques the trio developed in the 1990s and 2000s enabled huge breakthroughs in tasks like computer vision and speech recognition. Their work underpins the current proliferation of AI technologies, from self-driving cars to automated medical diagnoses. In fact, you probably interacted with the descendants of Bengio, Hinton, and LeCun's algorithms today -- whether that was the facial recognition system that unlocked your phone, or the AI language model that suggested what to write in your last email.


Reaching New Heights with Artificial Neural Networks

Communications of the ACM

Once treated by the field with skepticism (if not outright derision), the artificial neural networks that 2018 ACM A.M. Turing Award recipients Geoffrey Hinton, Yann LeCun, and Yoshua Bengio spent their careers developing are today an integral component of everything from search to content filtering. Here, the three researchers share what they find exciting and what challenges remain. There's so much more noise now about artificial intelligence than there was when you began your careers--some of it well-informed, some not. What do you wish people would stop asking you? GEOFFREY HINTON: "Is this just a bubble?"


Apple and Its Rivals Bet Their Futures on These Men's Dreams

#artificialintelligence

Over the past five years, artificial intelligence has gone from perennial vaporware to one of the technology industry's brightest hopes. Computers have learned to recognize faces and objects, understand the spoken word, and translate scores of languages. Apple, Facebook, and Microsoft have bet their futures largely on AI, racing to see who's fastest at building smarter machines. That's fueled the perception that AI has come out of nowhere, what with Tesla's self-driving cars and Alexa chatting up your child. But this was no overnight hit, nor was it the brainchild of a single Silicon Valley entrepreneur. The ideas behind modern AI--neural networks and machine learning--have roots you can trace to the last stages of World War II. Back then, academics were beginning to build computing systems meant to store and process information in ways similar to the human brain. Over the decades, the technology had its ups and downs, but it failed to capture the broad attention of computer scientists until around 2012, thanks to a handful of stubborn researchers who weren't afraid to look foolish. They remained convinced that neural nets would light up the world and alter humanity's destiny.


Welcome to the AI Conspiracy: The 'Canadian Mafia' Behind Tech's Latest Craze

#artificialintelligence

In the late '90s, Tomi Poutanen, a precocious computer whiz from Finland, hoped to do his dissertation on neural networks, a scientific method aimed at teaching computers to act and think like humans. For a student at the University of Toronto, it was a logical choice. Geoffrey Hinton, the godfather of neural network research, taught and ran a research lab there. But instead of encouraging Poutanen, who went on to work at Yahoo and recently co-founded media startup Milq, one of his professors issued a stern warning about taking the academic path known as deep learning. "Smart scientists," his professor cautioned, "go there to see their careers end."