"Banning killer robots is both politically savvy and morally necessary," said Mary Wareham, arms division advocacy director at Human Rights Watch and coordinator of the Campaign to Stop Killer Robots. "European states should take the lead and open ban treaty negotiations if they are serious about protecting the world from this horrific development."

Countries attending the annual meeting of states parties to the Convention on Certain Conventional Weapons (CCW) at the United Nations in Geneva will decide on November 15 whether to continue diplomatic talks on killer robots, also known as lethal autonomous weapons systems or fully autonomous weapons. Since 2014, these states have held eight meetings on lethal autonomous weapons systems under the auspices of the CCW, a major disarmament treaty. Over the course of those meetings, states have built a shared understanding of the concerns, but they have struggled to reach agreement on credible recommendations for multilateral action because of objections from a handful of military powers, most notably Russia and the United States.
On International Women's Day, weapons development won't be the first thing that springs to mind for achieving global gender equality. But banning autonomous weapons systems, known as "killer robots," is needed to strengthen global peace, advance human security and ensure a feminist future. Technology could be a benevolent force in our increasingly integrated society. The potential benefits of innovative advances in artificial intelligence, robotics, and machine learning could secure our future. As United Nations Secretary-General António Guterres said: "…these new capacities can help us to lift millions of people out of poverty, achieve the Sustainable Development Goals and enable developing countries to leapfrog into a better future."
NAIROBI (Thomson Reuters Foundation) - Countries are rapidly developing "killer robots" - machines with artificial intelligence (AI) that kill independently - but are moving at a snail's pace on agreeing global rules over their use in future wars, technology and human rights experts warn. From drones and missiles to tanks and submarines, semi-autonomous weapons systems have been used for decades to eliminate targets in modern-day warfare - but all of them operate under human supervision. Nations such as the United States, Russia and Israel are now investing in lethal autonomous weapons systems (LAWS) that can identify, target, and kill a person entirely on their own - but to date there are no international laws governing their use. "Some kind of human control is necessary ... Only humans can make context-specific judgements of distinction, proportionality and precautions in combat," said Peter Maurer, president of the International Committee of the Red Cross (ICRC).
On a video screen projected to a crowd attending the United Nations' Convention on Certain Conventional Weapons summit, a small drone whizzes past a tech executive. It shoots a projectile into the skull of a test dummy, detonating an explosive that could kill a human. In front of an audience, the executive pitches the drones as "unstoppable" and calls them capable of "an airstrike of surgical precision" that could render nuclear weapons obsolete. The scene quickly cuts to the drones being hijacked by terrorist organizations and going on a killing spree, targeting politicians and social media activists. The film, titled Slaughterbots and produced by the Future of Life Institute, shows how easily autonomous weapons could become weapons of mass destruction.