AAAI 2020 A Turning Point for Deep Learning? Hinton, LeCun, and Bengio Might Have Different Approaches

#artificialintelligence

This is an updated version. The Godfathers of AI and 2018 ACM Turing Award winners Geoffrey Hinton, Yann LeCun, and Yoshua Bengio shared a stage in New York on Sunday night at an event organized by the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020). The trio of researchers have made deep neural networks a critical component of computing, and in individual talks and a panel discussion they shared their views on the current challenges facing deep learning and where it should be heading. Introduced in the mid-1980s, deep learning gained traction in the AI community in the early 2000s. The year 2012 saw the publication of the CVPR paper Multi-column Deep Neural Networks for Image Classification, which showed how max-pooling CNNs on GPUs could dramatically improve performance on many vision benchmarks, while a similar system introduced months later by Hinton and a University of Toronto team won the large-scale ImageNet competition by a significant margin over shallow machine learning methods.
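For readers who want a concrete picture, the sketch below shows a small max-pooling CNN classifier in PyTorch. It is a minimal illustration of the kind of architecture those 2012 systems used, not a reproduction of either one; the layer sizes, input shape, and class count are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class SmallMaxPoolCNN(nn.Module):
    """Toy max-pooling CNN classifier (illustrative; assumes 32x32 RGB input)."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),   # learn local image filters
            nn.ReLU(),
            nn.MaxPool2d(2),                              # max-pooling halves resolution
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 8x8 after two pools
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x)                  # (N, 64, 8, 8)
        return self.classifier(h.flatten(1))  # logits over the classes

device = "cuda" if torch.cuda.is_available() else "cpu"  # GPUs drove the 2012 speedups
model = SmallMaxPoolCNN().to(device)
logits = model(torch.randn(4, 3, 32, 32, device=device))  # a dummy batch of images
```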


Self-supervised learning is the future of AI

#artificialintelligence

Despite the huge contributions of deep learning to the field of artificial intelligence, there's something very wrong with it: It requires huge amounts of data. This is one thing that both the pioneers and critics of deep learning agree on. In fact, deep learning didn't emerge as the leading AI technique until a few years ago because of the limited availability of useful data and the shortage of computing power to process that data. Reducing the data-dependency of deep learning is currently among the top priorities of AI researchers. In his keynote speech at the AAAI conference, computer scientist Yann LeCun discussed the limits of current deep learning techniques and presented the blueprint for "self-supervised learning," his roadmap to solve deep learning's data problem.


Self-supervised learning: The plan to make deep learning data-efficient

#artificialintelligence

This article is part of Demystifying AI, a series of posts that (try to) disambiguate the jargon and myths surrounding AI. Despite the huge contributions of deep learning to the field of artificial intelligence, there's something very wrong with it: It requires huge amounts of data. This is one thing that both the pioneers and critics of deep learning agree on. In fact, deep learning didn't emerge as the leading AI technique until a few years ago because of the limited availability of useful data and the shortage of computing power to process that data. Reducing the data-dependency of deep learning is currently among the top priorities of AI researchers.


Yann LeCun and Yoshua Bengio: Self-supervised learning is the key to human-level intelligence

#artificialintelligence

Self-supervised learning could lead to the creation of AI that's more human-like in its reasoning, according to Turing Award winners Yoshua Bengio and Yann LeCun. Bengio, director at the Montreal Institute for Learning Algorithms, and LeCun, Facebook VP and chief AI scientist, spoke candidly about this and other research trends during a session at the International Conference on Learning Representations (ICLR) 2020, which took place online. Supervised learning entails training an AI model on a labeled data set, and LeCun thinks it'll play a diminishing role as self-supervised learning comes into wider use. Instead of relying on annotations, self-supervised learning algorithms generate labels from the data itself by exposing relationships among the data's parts, a step believed to be critical to achieving human-level intelligence. "Most of what we learn as humans and most of what animals learn is in a self-supervised mode, not a reinforcement mode. It's basically observing the world and interacting with it a little bit, mostly by observation in a test-independent way," said LeCun.
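To make "generating labels from the data itself" concrete, the sketch below shows one common self-supervised setup: hide part of each input and train a network to reconstruct the hidden part from what remains. The model, masking scheme, and hyperparameters are illustrative assumptions, not the specific methods LeCun or Bengio proposed.

```python
# Minimal self-supervised sketch: the targets come from the data itself,
# so no human annotation is needed. Architecture and numbers are illustrative.
import torch
import torch.nn as nn

class MaskedReconstructor(nn.Module):
    """Tiny encoder-decoder that predicts hidden input values from visible ones."""
    def __init__(self, dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 64), nn.ReLU())
        self.decoder = nn.Linear(64, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

model = MaskedReconstructor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(32, 128)              # a batch of unlabeled inputs
mask = torch.rand_like(x) < 0.25      # randomly hide 25% of each input
corrupted = x.masked_fill(mask, 0.0)  # the model only sees the visible part

pred = model(corrupted)
loss = loss_fn(pred[mask], x[mask])   # target = the data's own hidden values
loss.backward()
opt.step()
```

Because the supervisory signal is derived from the inputs themselves, this loop runs on unlabeled data at whatever scale is available, which is the data-efficiency argument made in the passage above.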