Vector Institute


Progress in AI seems like it's accelerating, but here's why it could be plateauing

#artificialintelligence

I'm standing in what is soon to be the center of the world, or is perhaps just a very large room on the seventh floor of a gleaming tower in downtown Toronto. Showing me around is Jordan Jacobs, who cofounded this place: the nascent Vector Institute, which opens its doors this fall and which is aiming to become the global epicenter of artificial intelligence. We're in Toronto because Geoffrey Hinton is in Toronto, and Geoffrey Hinton is the father of "deep learning," the technique behind the current excitement about AI. "In 30 years we're going to look back and say Geoff is Einstein--of AI, deep learning, the thing that we're calling AI," Jacobs says. Of the researchers at the top of the field of deep learning, Hinton has more citations than the next three combined. His students and postdocs have gone on to run the AI labs at Apple, Facebook, and OpenAI; Hinton himself is a lead scientist on the Google Brain AI team.


Is AI Riding a One-Trick Pony?

MIT Technology Review

"In 30 years we're going to look back and say Geoff is Einstein--of AI, deep learning, the thing that we're calling AI," Jacobs says. Hinton's breakthrough, in 1986, was to show that backpropagation could train a deep neural net, meaning one with more than two or three layers. A 2012 paper by Hinton and two of his Toronto students showed that deep neural nets, trained using backpropagation, beat state-of-the-art systems in image recognition. That's the bottom layer of the club sandwich: 10,000 neurons (100x100) representing the brightness of every pixel in the image.


The rise of Canadian AI requires a few critical components

#artificialintelligence

To see the Canadian AI sector lift off, we surely need a few other critical parts. Data is, of course, the feedstock of AI: the basic raw material that tech companies need to train all forms of AI technologies. And Canadian corporations shouldn't wait for AI companies to knock on their doors; they need to actively look to do business with them, and commercial terms should be struck to fairly apportion ownership, upside, and risk.


Vector Institute for Artificial Intelligence ensures the world gets more Canada

#artificialintelligence

In his introduction, Jacobs offers a brief history of Canada's pioneering contribution to the field of artificial intelligence (AI), explaining the significance of the shift from rules-based AI to machine learning that originated in Ontario. The Canadian Institute for Advanced Research (CIFAR) that Jacobs refers to approved its first program, Artificial Intelligence & Robotics, in 1982, while operating out of an Ontario government office just a few blocks from where Jacobs is sitting, and later recruited Geoffrey Hinton to Toronto. A global race for the region's AI talent ensued, with alumni of Hinton's University of Toronto Machine Learning program going on to fill top AI R&D roles at Apple, Facebook, OpenAI, and Google Brain, as well as Microsoft and Google DeepMind, among others, while Hinton himself became renowned the world over as the "godfather of deep learning." Contact the Ontario Investment Office to learn how your company can leverage the world's leading AI talent.