Over the weekend, experts on military artificial intelligence from more than 80 world governments converged on the U.N. offices in Geneva for the start of a week's talks on autonomous weapons systems. Many of them fear that, after gunpowder and nuclear weapons, we are now on the brink of a "third revolution in warfare," heralded by killer robots: fully autonomous weapons that could decide whom to target and kill without human input. With autonomous technology already in development in several countries, the talks mark a crucial point for governments and activists who believe the U.N. should play a key role in regulating the technology. The meeting comes at a critical juncture. In July, Kalashnikov, the main defense contractor of the Russian government, announced it was developing a weapon that uses neural networks to make "shoot-no shoot" decisions.
A key opponent of high-tech automated weapons known as 'killer robots' is blaming countries like the U.S. and Russia for blocking consensus at a U.N.-backed conference, where most countries wanted to ensure that humans stay at the controls of lethal machines. Mary Wareham, coordinator of the Campaign to Stop Killer Robots, spoke Monday after experts from dozens of countries agreed before dawn Saturday at the U.N. in Geneva on 10 'possible guiding principles' about such lethal autonomous weapons systems (LAWS). Point 2 said: 'Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines.' Wareham said such language wasn't binding, adding that 'it's time to start laying down some rules now.' Members of the LAWS conference will meet again in November. Last week, as the talks kicked off, Amnesty International said killer robots must be banned to prevent unlawful killings, injuries and other violations of human rights 'before it's too late.'
A very, very small quadcopter, one inch in diameter, can carry a one- or two-gram shaped charge. You can order them from a drone manufacturer in China. You can program the code to say: "Here are thousands of photographs of the kinds of things I want to target." A one-gram shaped charge can punch a hole in nine millimeters of steel, so presumably you can also punch a hole in someone's head. You can fit about three million of those in a semi-tractor-trailer. You can drive up I-95 with three trucks and have 10 million weapons attacking New York City. They don't have to be very effective; only 5 or 10 percent of them have to find the target. There will be manufacturers producing millions of these weapons that people will be able to buy just like you can buy guns now, except millions of guns don't matter unless you have a million soldiers. You need only three guys to write the program and launch them. So you can just imagine that in many parts of the world humans will be hunted. They will be cowering underground in shelters and devising techniques so that they don't get detected. This is the ever-present cloud of lethal autonomous weapons.

Mary Wareham laughs a lot. It usually sounds the same regardless of the circumstance, like a mirthful giggle the blonde New Zealander can't suppress, but it bubbles up at the most varied moments. Wareham laughs when things are funny, she laughs when things are awkward, she laughs when she disagrees with you. And she laughs when things are truly unpleasant, like when you're talking to her about how humanity might soon be annihilated by killer robots and the world is doing nothing to stop it.
One afternoon this spring at the United Nations in Geneva, I sat behind Wareham in a large wood-paneled, beige-carpeted assembly room that hosted the Convention on Certain Conventional Weapons (CCW), a group of 121 countries that have signed the agreement to restrict weapons that "are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately"; in other words, weapons humanity deems too cruel to use in war. The UN moves at a glacial pace, but the CCW moves slower still.
Campaigners are renewing calls for a pre-emptive ban on so-called "killer robots" as representatives of more than 80 countries meet to discuss autonomous weapons systems. The use of lethal autonomous weapons systems (LAWS) is "a step too far," said Mary Wareham, the global coordinator of the Campaign to Stop Killer Robots. "They cross a moral line, because we would see machines taking human lives on the battlefield or in law enforcement," she said. "We want weapon systems and the use of force to remain under human control." Wareham spoke to Al Jazeera before Monday's meeting in Geneva, Switzerland, on a possible ban on LAWS.