World leaders have woken up to the potential of artificial intelligence (AI) over the past year. Billions of dollars in governmental funding have been announced, dozens of hearings have been held, and nearly 20 national plans have been adopted. In the past couple of months alone, Canada and France launched the International Panel on AI (IPAI), modeled on the Intergovernmental Panel on Climate Change, to examine the global impacts of AI; the U.S. Congress created a National Security Commission on the subject; and the Pentagon tasked one of its top advisory bodies with devising ethical principles for its use of AI. "AI" broadly refers to the science and technology of machines capable of sophisticated information processing. Current applications include face recognition, image analysis, language translation and processing, autonomous vehicles, robotics, game-playing, and recommendation engines. Many more applications are likely to emerge in the coming years and decades.
On December 30, researchers using artificial intelligence systems to comb through media and social platforms detected the spread of an unusual flu-like illness in Wuhan, China. It would be days before the World Health Organization released a risk assessment, and a full month before the UN agency declared a global public health emergency for the novel coronavirus. Could AI systems have accelerated the process and limited, or even halted, the spread of the COVID-19 pandemic? Clark Freifeld, a Northeastern University computer scientist working with the global disease surveillance platform HealthMap, one of the systems that detected the outbreak, said it remains an open question. "We identified the early signals, but the reality is it's hard to tell when you have an unidentified respiratory illness if it's a really serious situation," said Freifeld.
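The kind of early signal such platforms look for can be illustrated with a toy anomaly detector: count illness-related mentions per day and flag days that jump far above the recent baseline. This is a minimal sketch for illustration only; the function name, window, and threshold are invented here, and HealthMap's actual pipeline (natural-language processing over news and social feeds) is far more sophisticated.

```python
import statistics

def detect_spike(daily_counts, window=7, threshold=3.0):
    """Flag days whose count jumps well above the trailing-window mean,
    measured in standard deviations (a simple z-score test).
    Returns the indices of flagged days."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1.0  # guard against zero spread
        z = (daily_counts[i] - mean) / stdev
        if z > threshold:
            flagged.append(i)
    return flagged

# Daily mentions of an unusual pneumonia: quiet baseline, then a jump.
counts = [2, 3, 2, 4, 3, 2, 3, 2, 3, 25]
print(detect_spike(counts))  # → [9]
```

As Freifeld's caveat suggests, a statistical spike says nothing by itself about severity; the hard part is interpreting the signal, not detecting it.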
In this paper, we argue that the effects of artificial intelligence (AI) and automation on growth and employment depend to a large extent on institutions and policies. In the first part of the paper we survey the most recent literature to show that AI can spur growth by replacing labor with capital, both in the production of goods and services and in the production of ideas. However, AI may inhibit growth if combined with inappropriate competition policy. In the second part of the paper we discuss the effect of robotization on employment in France over the 1994-2014 period. Based on our empirical analysis of French data, we first show that robotization reduces aggregate employment at the employment-zone level, and second that less-educated workers are more negatively affected by robotization than educated workers. This finding suggests that inappropriate labor market and education policies reduce the positive impact that AI and automation could have on employment. This paper draws extensively on our article on AI and economic growth, published in Economics and Statistics (Aghion et al., 2019). Artificial intelligence (AI) is typically defined as the capability of a machine to imitate intelligent human behavior. Indeed, since 1820 our economies have seen several technological revolutions that resulted in the automation of tasks previously performed by labor.
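The growth mechanism summarized above can be sketched with a stylized task-based production function, in the spirit of the automation-and-growth literature (e.g., Zeira, 1998; Aghion, Jones, and Jones, 2017). This is a one-line simplification for illustration, not the paper's full model. With a unit continuum of tasks, a fraction $\beta_t$ of which has been automated (performed by capital rather than labor), Cobb-Douglas aggregation over tasks gives

```latex
Y_t \;=\; A_t\, K_t^{\beta_t}\, L_t^{1-\beta_t},
```

so as automation raises $\beta_t$, capital substitutes for labor and the capital share of income rises. Whether this lifts growth or instead depresses employment then hinges on the competition, labor-market, and education policies the paper emphasizes.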
In 2019, former German Defense Minister Karl-Theodor zu Guttenberg took on a senior leadership position at the artificial intelligence (AI) company Augustus Intelligence. Guttenberg had previously urged Europe to take the lead in AI. We have alluded for quite some time to the possibility of Guttenberg leading a united Europe. The experience he has gained in the United States may further qualify him to lead Europe's digital transformation and more. Augustus Intelligence has recently been involved in a legal dispute with two fired managers, former sales director Marco Pacelli and consultant Ed Crump.
The Black Death in the 1300s broke the long-ingrained feudal system in Europe and replaced it with the more modern employment contract. A mere three centuries later, a deep economic recession -- thanks to the Hundred Years' War between England and France -- kick-started a major innovation drive that radically improved agricultural productivity. Fast forward to more recent times: the SARS epidemic of 2002-2004 catalyzed the meteoric growth of a then-small e-commerce company called Alibaba and helped establish it at the forefront of retail in Asia. This growth was fueled by underlying anxiety around traveling and human contact, similar to what we see today with Covid-19. The financial crisis of 2008 also produced its own disruptive side effects.
Fusion devices called tokamaks run an increased risk of disruptions as researchers, aiming to maximize fusion power and so recreate on Earth the process that powers the sun and stars, bump up against the facilities' operational limits. Scientists must therefore be able to boost fusion power without hitting those limits. This capability will be crucial for ITER, the large international tokamak under construction in France to demonstrate the practicality of fusion energy. Fusion reactions combine light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei that makes up 99 percent of the visible universe -- to generate massive amounts of energy. Scientists around the world are seeking to harness fusion as a virtually inexhaustible supply of safe and clean power for generating electricity.
Lionel Chocron, Chief Product Officer at Hedera Hashgraph, was born in France. About twenty years ago, he came to the US west coast to earn a master's degree at UC Berkeley, after which he took a job at Cisco. He spent 10 years there focusing on emerging technology -- he ran Cisco's Internet of Things business unit for years -- and on corporate strategy. He then joined Oracle to lead the emerging technology industry solution group, which focused on imaging technology, IoT, blockchain, and AI. In 2019, he crossed paths with the co-founders of Hedera Hashgraph and grew excited about blockchain technology.
The European Commission says that the EU could become the most attractive, secure, and dynamic data-agile economy in the world. The Commission's new data strategy is for the EU to seize new opportunities in digitised industry and business-to-business artificial intelligence (AI) applications. However, the Commission has scrupulously avoided the vital question of whether the GDPR is an obstacle to the EU's plans to become an AI hub. The European Commission announced its new EU data strategy with the publication of two papers in February 2020: a white paper on AI and a communication entitled "A European strategy for data".
Yoshua Bengio: Yoshua Bengio OC FRSC (born 1964 in Paris, France) is a Canadian computer scientist, most noted for his work on artificial neural networks and deep learning. He was a co-recipient of the 2018 ACM A.M. Turing Award for his work in deep learning. He is a professor in the Department of Computer Science and Operations Research at the Université de Montréal and scientific director of the Montreal Institute for Learning Algorithms (MILA). Geoffrey Hinton: Geoffrey Everest Hinton CC FRS FRSC (born 6 December 1947) is an English-Canadian cognitive psychologist and computer scientist, most noted for his work on artificial neural networks. Since 2013, he has divided his time between Google (Google Brain) and the University of Toronto.
The future of extreme weather prediction may lie in modernizing a piece of technology from the past. Researchers recently developed a new technique to augment an old-fashioned weather forecasting method with the power of deep learning, a subset of artificial intelligence (AI). Once the deep learning system is fully trained, it is able to predict extreme weather events like heat waves and cold spells with 80% accuracy up to 5 days beforehand. "This is a very inexpensive way of predicting extreme events at least a few days ahead of time," said Ashesh Chattopadhyay, a mechanical engineering graduate student at Rice University in Houston and lead author on the project. The project began when Pedram Hassanzadeh, an assistant professor of mechanical engineering at Rice, realized that extreme weather events like heat waves and cold spells usually arise from very unusual atmospheric circulation patterns that could potentially be taught to a pattern recognition computer program.
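The "old-fashioned" method being augmented here is pattern matching against past circulation states: find the historical pattern most similar to today's and assume similar weather follows. The toy NumPy sketch below illustrates only that matching step, with invented 4x4 "patterns" and labels standing in for real atmospheric fields; the Rice system trains a deep learning model on full circulation data rather than using raw nearest-neighbor distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical library of past circulation "patterns" (tiny 4x4 grids
# here), each labeled with the extreme weather that followed it.
heat_pattern = np.ones((4, 4))    # stand-in for a heat-wave precursor
cold_pattern = -np.ones((4, 4))   # stand-in for a cold-spell precursor
library = [
    (heat_pattern + 0.1 * rng.standard_normal((4, 4)), "heat wave"),
    (cold_pattern + 0.1 * rng.standard_normal((4, 4)), "cold spell"),
]

def analog_forecast(pattern, library):
    """Return the label of the library pattern closest (in Euclidean
    distance) to today's circulation pattern -- its 'analog'."""
    dists = [(np.linalg.norm(pattern - p), label) for p, label in library]
    return min(dists, key=lambda t: t[0])[1]

today = heat_pattern + 0.2 * rng.standard_normal((4, 4))
print(analog_forecast(today, library))  # → heat wave
```

Replacing the distance computation with a trained pattern-recognition network is, loosely, what lets the approach generalize beyond exact lookalikes of past events.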