Can artificial intelligence ever boost the productivity of firms and industries the way the PC and networking did in the 1980s and '90s? A favorite pastime of economists in those decades was trying to gauge how much corporate and industrial productivity would benefit from the then-novel phenomena of personal computers, workgroup servers, and computer networking. At first the gains were hard to see, but in time economists did find evidence that information technology contributed to boosting economic productivity. It is too soon to expect comparable data for artificial intelligence, today's big IT revolution: the technology is only now becoming industrialized, and many companies have yet to try tools such as machine learning in any significant way.
When economists talk about "superstar" anything, they're referencing a phenomenon first described in the early 1980s. It began as a product of mass media and was put into overdrive by the internet. In an age when the reach of everything we make is greater than ever, an elite class of bankers, chief executives, programmers, Instagram influencers and just about anyone with in-demand technical skills has seen its incomes grow far faster than those of the middle class. In this winner-take-all economy, the superstar firms--think Apple, Google and Amazon, but also their increasingly high-tech equivalents in finance, health care and every other industry--appear to account for most of the divergence in productivity and profits between companies in the U.S. As firms cluster around talent, and talent is in turn drawn to those firms, the result is a self-reinforcing trend toward ever-richer, ever-costlier metro areas that are economically dominant over the rest of the country. And while the internet was supposed to erase distance, it cannot yet replace the high-quality face-to-face communication required for rapid-fire innovation.
A century ago, Europe was a global powerhouse of innovation, but it has started to lose its edge: today, despite some notable exceptions, many of the most innovative companies are found elsewhere. Europe is falling behind in growing sectors as well as in areas of innovation such as genomics, quantum computing, and artificial intelligence, where it is being outpaced by the United States and China. A discussion paper from the McKinsey Global Institute (MGI) suggests five paths that could help the continent regain its competitive edge. The paper, Innovation in Europe: Changing the game to regain a competitive edge, focuses on ways that Europe could build on its strengths rather than trying to play catch-up, given that it is hindered by fragmentation and lack of scale. This article is a condensed version of the original paper, which draws from MGI research as well as from a recent collaboration with the World Economic Forum. Given Europe's relatively high wage costs and low reliance on natural resources, innovation remains fundamental to the continent's economic and social system. European companies still account for one-quarter of the world's total industrial R&D, but over the past ten years US companies have continued to increase their share, reinforcing their leadership position.
This is part 2 of a three-part series examining the effects of robots and automation on employment, based on new research from economist and Institute Professor Daron Acemoglu. Overall, adding robots to manufacturing eliminates jobs -- more than three per robot added, in fact. But the new study reveals an important pattern: firms that move quickly to adopt robots tend to add workers to their payrolls, while industry-wide job losses are concentrated in firms that make the change more slowly. The study examines the introduction of robots to French manufacturing in recent decades, illuminating the business dynamics and labor implications in granular detail. "When you look at use of robots at the firm level, it is really interesting because there is an additional dimension," says Acemoglu.
Every day we hear claims that Artificial Intelligence (AI) systems are about to transform the economy, creating mass unemployment and vast monopolies. But what do professional economists think about this? Economists have been studying the relationship between technological change, productivity and employment since the discipline's beginnings with Adam Smith's pin factory. It should therefore come as no surprise that AI systems able to behave appropriately in a growing number of situations -- from driving cars to detecting tumours in medical scans -- have caught their attention. In September 2017, a group of distinguished economists gathered in Toronto to set out a research agenda for the Economics of Artificial Intelligence.