Should We Fear Artificial Superintelligence?
Speaking at a conference in Lisbon, Portugal, shortly before his death, Stephen Hawking told attendees that the development of artificial intelligence might become the "worst event in the history of our civilization," and he had every reason for concern. What AI researchers, ethicists, and others call an artificial superintelligence (ASI) has the potential to become more powerful than anything this planet has ever seen, and it poses what may be the final existential challenge humanity ever faces as a species.

To better understand what concerned Stephen Hawking, Elon Musk, and many others, we need to deconstruct the popular-culture depictions of AI. The reality is that AI has been with us for a while now, ever since computers were first able to make decisions based on inputs and conditions. When we see a threatening AI system in the movies, it is the malevolence of the system, coupled with the power of a computer, that scares us.
Feb-24-2019, 09:47:25 GMT