Fact and Fiction Behind the Threat of 'Killer AI'

#artificialintelligence 

Oren Etzioni, professor of Computer Science at the University of Washington and CEO of the Allen Institute for Artificial Intelligence, argues that alarming headlines about 'killer AI' are in fact strongly influenced by the work of one man: professor Nick Bostrom of the Faculty of Philosophy at Oxford University, author of the bestselling treatise Superintelligence: Paths, Dangers, and Strategies. Essentially, Bostrom claims that if machine brains surpass human brains in general intelligence, the resulting new 'superintelligence' could replace humans as the dominant life form on Earth. Furthermore, according to his findings, there is a 10-percent probability that human-level AI will be attained by 2022, a 50-percent probability that this feat will be achieved by 2040, and a 90-percent probability that such an entity will be created by 2075. However, in his article published in MIT Technology Review, Etzioni points out that Bostrom's main source of data is an aggregate of four surveys of different groups, including participants in the Philosophy and Theory of AI conference held in Thessaloniki in 2011 and members of the Greek Association for Artificial Intelligence. Furthermore, it appears that Bostrom provided neither the response rates nor the phrasing of the questions used in those surveys, nor did he account for the heavy reliance on data collected in Greece.
