A new generation of autonomous weapons or "killer robots" could accidentally start a war or cause mass atrocities, a former top Google software engineer has warned. Laura Nolan, who resigned from Google last year in protest at being sent to work on a project to dramatically enhance US military drone technology, has called for all AI killing machines not operated by humans to be banned. Nolan said killer robots not guided by human remote control should be outlawed by the same type of international treaty that bans chemical weapons. Unlike drones, which are controlled by military teams often thousands of miles away from where the flying weapon is being deployed, Nolan said killer robots have the potential to do "calamitous things that they were not originally programmed for". There is no suggestion that Google is involved in the development of autonomous weapons systems.
Increasingly sophisticated killer AI robots and machines could accidentally start a war and lead to mass atrocities, an ex-Google worker has told The Guardian. Laura Nolan resigned from Google last year in protest at being assigned to Project Maven, which was aimed at enhancing U.S. military drone technology. She has called for all unmanned autonomous weapons to be banned. AI killer robots have the potential to do "calamitous things that they were not originally programmed for," Nolan explained to The Guardian. She is part of a growing group of experts who are raising concerns over the development of artificial intelligence programmed into war machines.
A former Google engineer has expressed fears about a new generation of robots that could carry out 'atrocities and unlawful killings'. Laura Nolan, who previously worked on the tech giant's military drone initiative, Project Maven, is calling for a ban on all autonomous war drones, as these machines do not have the same common sense or discernment as humans. Project Maven focused on enhancing drones with artificial intelligence (AI) to distinguish enemy targets from people and other objects – but was discontinued after employees protested the technology in development, calling it 'evil'. Nolan, who left Google in 2018 in protest against the US military drone technology, is now calling for all drones not operated by humans to fall under the same ban as chemical weapons, according to The Guardian.
Advancements in artificial intelligence may result in "atrocities" because the technology will behave in unexpected ways, a former Google software engineer has warned. Computer scientist Laura Nolan left Google in June last year after raising concerns about its work with the U.S. Department of Defense on Project Maven, a drone program that was using AI algorithms to speed up analysis of vast amounts of captured surveillance footage. Speaking to The Guardian, the software engineer said the use of autonomous or AI-enhanced weapons systems that lack a human touch may have severe, even fatal, consequences. She said: "What you are looking at are possible atrocities and unlawful killings even under laws of warfare, especially if hundreds or thousands of these machines are deployed. There could be large-scale accidents because these things will start to behave in unexpected ways."
A former Google software engineer is sounding the alarm on killer robots. Laura Nolan resigned from Google last year when the tech giant started working with the U.S. military on drone technology, and since then, she has joined the Campaign to Stop Killer Robots, warning that autonomous robots with lethal capabilities could become a threat to humanity. Discussions concerning possibly banning autonomous weapons fell apart on August 21 during a United Nations meeting in Geneva, when Russian diplomats allegedly made a fuss over the language that was used in a document meant to begin the process of establishing a ban. "If you're a despot, how much easier is it to have a small cadre of engineers control a fleet of autonomous weapons for you than to have to keep your troops in line?" Nolan tells Inverse. "Autonomous weapons are potential weapons of mass destruction. They need to be made taboo in the same way that chemical and biological weapons are."