Vector Institute


Progress in AI seems like it's accelerating, but here's why it could be plateauing


"In 30 years we're going to look back and say Geoff is Einstein--of AI, deep learning, the thing that we're calling AI," Jacobs says. Hinton's breakthrough, in 1986, was to show that backpropagation could train a deep neural net, meaning one with more than two or three layers. A 2012 paper by Hinton and two of his Toronto students showed that deep neural nets, trained using backpropagation, beat state-of-the-art systems in image recognition. That's the bottom layer of the club sandwich: 10,000 neurons (100x100) representing the brightness of every pixel in the image.

