For every up there is a down, for every wrong a right, and for every good thing a bad to go with it. Autonomous weapons could bring many benefits, and if you were ever in a position where you needed a weapon, you would be grateful to have one by your side. But what if that weapon suddenly malfunctioned and aimed at you instead? This is the main worry for many people when they think about autonomous weapons, and it appears that majorities of people in 89 different nations share it.
Google today released guidelines for the creation of artificial intelligence, including a ban on building autonomous weaponry and most applications of AI with the potential to harm people. The guidelines arrive just days after Google announced it would not renew its contract with the U.S. Department of Defense to analyze drone footage. As a self-described AI-first company, owner of the data science platform Kaggle, creator of the popular open source framework TensorFlow, and employer of prominent researchers, Google is one of the most influential companies in AI. "We want to be clear that while we are not developing AI for use in weapons, we will continue our work with governments and the military in many other areas. These include cybersecurity, training, military recruitment, veterans' healthcare, and search and rescue," CEO Sundar Pichai said in a blog post. In the post, Pichai spells out the principles that should guide the creation of AI, as well as the applications of AI that Google will not pursue.
According to Google leadership, the company will not renew its Project Maven contract when it expires in 2019. Project Maven is the U.S. military program in which Google used artificial intelligence to detect and identify people or objects in military drone surveillance videos. Many Google employees were upset by the arrangement; some 3,000 of them signed a petition voicing concerns that the company's involvement with the military could harm Google, objecting in particular to the development of image recognition technology that military drones could use to identify and track targets. Gizmodo reported on June 1 that the company would not renew the Project Maven contract after June 2019.
Pioneers from the worlds of artificial intelligence and robotics, including Elon Musk and DeepMind's Mustafa Suleyman, have asked the United Nations to ban autonomous weapon systems. A letter from the experts warns that the weapons currently under development risk opening a "Pandora's box" that, if left open, could set off a dangerous "third revolution in warfare." The open letter coincides with the International Joint Conference on Artificial Intelligence, currently being held in Melbourne, Australia. Ahead of the same conference in 2015, the Tesla founder was joined by Stephen Hawking, Steve Wozniak, and Noam Chomsky in condemning a new "global arms race." Suggestions that warfare will be transformed by artificially intelligent weapons capable of making their own decisions about whom to kill are not hyperbolic.