

Progress in AI seems like it's accelerating, but here's why it could be plateauing

#artificialintelligence

"In 30 years we're going to look back and say Geoff is Einstein--of AI, deep learning, the thing that we're calling AI," Jacobs says. Hinton's breakthrough, in 1986, was to show that backpropagation could train a deep neural net, meaning one with more than two or three layers. A 2012 paper by Hinton and two of his Toronto students showed that deep neural nets, trained using backpropagation, beat state-of-the-art systems in image recognition. That's the bottom layer of the club sandwich: 10,000 neurons (100x100) representing the brightness of every pixel in the image.
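The excerpt's description of backpropagation training a net whose bottom layer is 10,000 (100x100) pixel-brightness units can be sketched minimally as follows. This is an illustrative assumption-laden toy, not Hinton's actual architecture: the hidden size, learning rate, and use of sigmoid units and squared error are arbitrary choices for demonstration.

```python
import numpy as np

# Toy sketch of backpropagation (illustrative only; sizes and learning
# rate are assumptions, not taken from the article).
rng = np.random.default_rng(0)

n_in = 100 * 100   # one input unit per pixel brightness, as in the excerpt
n_hidden = 64      # assumed hidden-layer size
n_out = 10         # assumed number of output classes

# Small random weights for a net with one hidden layer.
W1 = rng.normal(0.0, 0.01, (n_in, n_hidden))
W2 = rng.normal(0.0, 0.01, (n_hidden, n_out))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, y, lr=0.1):
    """One forward pass, then backpropagate the error gradient
    through each layer via the chain rule and update the weights."""
    global W1, W2
    h = sigmoid(x @ W1)            # hidden activations
    out = sigmoid(h @ W2)          # network output
    err = out - y                  # gradient of squared error w.r.t. output
    d_out = err * out * (1 - out)  # through the output sigmoid
    d_hid = (d_out @ W2.T) * h * (1 - h)  # propagated back to hidden layer
    W2 -= lr * np.outer(h, d_out)
    W1 -= lr * np.outer(x, d_hid)
    return float(np.mean(err ** 2))

# A dummy 100x100 "image" flattened to 10,000 brightness values.
x = rng.random(n_in)
y = np.zeros(n_out)
y[3] = 1.0  # one-hot target for an assumed class
losses = [train_step(x, y) for _ in range(50)]
```

Repeating `train_step` on the same example drives the loss down, which is the core of what the 1986 result showed backpropagation could do for nets with more than two or three layers.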



The 'Godfather of AI' on making machines clever and whether robots really will learn to kill us all?

#artificialintelligence

His former students have now been poached by Silicon Valley to lead AI research at the likes of Apple, Facebook and Google (which has also appointed him a vice president and engineering fellow). Hinton is a pioneer of machine learning, which enables computers to come up with programmes to solve problems themselves. In particular, he has devised a subset of machine learning called "deep learning", whereby neural networks modelled on those that form the human brain enable machines to learn in the same way a toddler does. Through the work of Hinton and his colleagues – dubbed by their rivals the "Canadian Mafia" – the potential of machine learning has become limitless.


songrotek/Deep-Learning-Papers-Reading-Roadmap

#artificialintelligence

If you are a newcomer to the Deep Learning area, the first question you may have is "Which paper should I start reading from?" After reading the papers above, you will have a basic understanding of Deep Learning history, the basic architectures of Deep Learning models (including CNN, RNN, and LSTM), and how deep learning can be applied to image and speech recognition problems. The following papers will give you an in-depth understanding of Deep Learning methods, Deep Learning in different areas of application, and the frontiers of the field. "Deep compression: Compressing deep neural network with pruning, trained quantization and Huffman coding."