To learn who's taking home the Turing Award, people might turn to their trusted talking bots, like Siri or Alexa, or, in fact, to some of the very technology the three winners helped bring to life. Yoshua Bengio, Geoffrey Hinton and Yann LeCun have earned what's often referred to as the Nobel Prize of the tech world for their pioneering work in artificial intelligence, the Association for Computing Machinery announced Wednesday. The researchers, working both independently and together, helped advance the thinking and application of neural networks, the technology that gives computers the ability to recognize patterns, interpret language and glean insights from complex data. "Artificial intelligence is now one of the fastest-growing areas in all of science and one of the most talked-about topics in society," Cherri Pancake, president of the computing society, said in a statement.
The 2018 Turing Award, known as the "Nobel Prize of computing," has been given to a trio of researchers who laid the foundations for the current boom in artificial intelligence. Yoshua Bengio, Geoffrey Hinton, and Yann LeCun -- sometimes called the "godfathers of AI" -- have been recognized with the $1 million annual prize for their work developing the AI subfield of deep learning. The techniques the trio developed in the 1990s and 2000s enabled huge breakthroughs in tasks like computer vision and speech recognition. Their work underpins the current proliferation of AI technologies, from self-driving cars to automated medical diagnoses. In fact, you probably interacted with the descendants of Bengio, Hinton, and LeCun's algorithms today -- whether that was the facial recognition system that unlocked your phone, or the AI language model that suggested what to write in your last email.
In the late 1980s, Canadian master's student Yoshua Bengio became captivated by an unfashionable idea. A handful of artificial intelligence researchers was trying to craft software that loosely mimicked how networks of neurons process data in the brain, despite scant evidence it would work. "I fell in love with the idea that we could both understand the principles of how the brain works and also construct AI," says Bengio, now a professor at the University of Montreal. More than 20 years later, the tech industry fell in love with that idea, too. Neural networks are behind the recent bloom of progress in AI that has enabled projects such as self-driving cars and phone bots practically indistinguishable from people.
IMAGE: Yoshua Bengio, Co-recipient of the ACM A.M. Turing Award, will present his Turing Lecture at the Heidelberg Laureate Forum on September 23, 2019. ACM, the Association for Computing Machinery, today announced that Yoshua Bengio, co-recipient of the 2018 ACM A.M. Turing Award, will present his Turing Award Lecture, "Deep Learning for AI," at the Heidelberg Laureate Forum on September 23 in Heidelberg, Germany. Bengio is a professor at the University of Montreal and Scientific Director at Mila, Quebec's Artificial Intelligence Institute. He received the 2018 ACM A.M. Turing Award with Geoffrey Hinton, VP and Engineering Fellow of Google, and Yann LeCun, VP and Chief AI Scientist at Facebook. Bengio, Hinton and LeCun were recognized for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing.
Deep learning has benefited, and continues to benefit, from the pioneering work of Geoff Hinton, Yann LeCun and Yoshua Bengio beginning in the late 1980s. LeCun's contributions, especially the development of convolutional neural networks and their applications in computer vision and other areas of artificial intelligence, form the basis of many products and services deployed across most technology companies today. Here are a few of Yann's groundbreaking research papers that have contributed greatly to this field: The ability of neural networks to generalize can be greatly enhanced by providing constraints from the task domain. As a follow-up to his widely popular work on backpropagation, in this paper Yann and his colleagues demonstrate how such constraints can be integrated into a backpropagation network through the architecture of the network itself. This approach was successfully applied to the recognition of handwritten ZIP code digits provided by the US Postal Service.
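The architectural constraint at the heart of a convolutional network is weight sharing: one small kernel is slid across the entire image, so every position in the output reuses the same handful of weights, and the network learns features that do not depend on where in the image they appear. Here is a minimal sketch of that idea in plain NumPy (the `conv2d` function and the toy image and kernel are illustrative assumptions, not code from LeCun's paper):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2D 'valid' convolution (cross-correlation, as in most
    neural-network libraries): the same small kernel is applied at
    every position, so all output cells share one set of weights."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output value is the dot product of the kernel
            # with the image patch under it.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A toy 5x5 "image" containing a vertical stroke, like part of a digit.
image = np.array([
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
], dtype=float)

# A 3x3 vertical-edge detector: responds positively where intensity
# rises left-to-right and negatively where it falls.
kernel = np.array([
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
], dtype=float)

features = conv2d(image, kernel)  # shape (3, 3)
```

With nine shared weights instead of one weight per (input pixel, output unit) pair, the parameter count is drastically reduced, which is one reason such networks generalize well from limited training data like the USPS digit set.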