Don't fear the robopocalypse: Autonomous weapons expert Paul Scharre

#artificialintelligence

The Doomsday Clock is an internationally recognized design that conveys how close we are to destroying our civilization with dangerous technologies of our own making. First and foremost among these are nuclear weapons, but the dangers include climate-changing technologies, emerging...


The world's top artificial intelligence companies are pleading for a ban on killer robots

#artificialintelligence

Elon Musk, founder, CEO, and lead designer at SpaceX and co-founder of Tesla, speaks at the International Space Station Research and Development Conference in Washington, U.S., July 19, 2017. A revolution in warfare, in which killer robots, or autonomous weapons systems, become common on the battlefield, is about to begin, and both scientists and industry are worried. The world's top artificial intelligence (AI) and robotics companies have used a conference in Melbourne to collectively urge the United Nations to ban killer robots, or lethal autonomous weapons. An open letter signed by 116 founders of robotics and AI companies from 26 countries was launched at the world's biggest AI conference, the International Joint Conference on Artificial Intelligence (IJCAI), as the UN delays a meeting on the robot arms race until later this year.


Thousands of scientists pledge not to help build killer AI robots

#artificialintelligence

Thousands of scientists who specialise in artificial intelligence (AI) have declared that they will not participate in the development or manufacture of robots that can identify and attack people without human oversight. Demis Hassabis at Google DeepMind and Elon Musk at the US rocket company SpaceX are among more than 2,400 signatories to the pledge, which is intended to deter military firms and nations from building lethal autonomous weapons systems, also known as LAWS. The move is the latest by concerned scientists and organisations to highlight the dangers of handing life-and-death decisions to AI-enhanced machines. It follows calls for a preemptive ban on technology that campaigners believe could usher in a new generation of weapons of mass destruction. Orchestrated by the Boston-based Future of Life Institute, the pledge calls on governments to agree norms, laws and regulations that stigmatise and effectively outlaw the development of killer robots.


Lethal Microdrones, Dystopian Futures, and the Autonomous Weapons Debate

IEEE Spectrum Robotics

This week, the first meeting of the Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts on lethal autonomous weapons systems is taking place at the United Nations in Geneva. Organizations like the Campaign to Stop Killer Robots are encouraging the UN to move forward with international regulation of autonomous weapons, which is a good thing, because how these issues are worked out will shape the future of robotics and society.