Is 'killer robot' warfare closer than we think?

BBC News

More than 100 of the world's top robotics experts recently wrote a letter to the United Nations calling for a ban on the development of "killer robots" and warning of a new arms race. But are their fears really justified?

Entire regiments of unmanned tanks; drones that can spot an insurgent in a crowd of civilians; and weapons controlled by computerised "brains" that learn like we do are all among the "smart" tech being unleashed by an arms industry many believe is now entering a "third revolution in warfare".

"In every sphere of the battlefield - in the air, on the sea, under the sea or on the land - the military around the world are now demonstrating prototype autonomous weapons," says Toby Walsh, professor of artificial intelligence at the University of New South Wales in Sydney. "New technologies like deep learning are helping drive this revolution."


China selling deadly AI 'Blowfish' drones that decide who lives and who dies to Middle East war zones


CHINA is selling deadly 'Blowfish' drones which can decide who lives and who dies to armies in the war-torn Middle East, say reports. The unmanned war machines are capable of launching autonomous strikes with their arsenal of mortar shells, grenade launchers and machine guns. They are said to be "impossible to defend" against, and the Pentagon has already made it clear it fears they will end up in the wrong hands. Some military experts fear the proposed sale of the AI mini-choppers will spark even more bloodshed in the troubled region, reports news.com. "They would be impossible to defend yourself against," warns University of New South Wales Professor of Artificial Intelligence Toby Walsh.


South Korean university's AI work for defense contractor draws boycott


Image caption: An autonomous sentry freezes an "intruder" during a 2006 test of the weapons system by the South Korean military.

Fifty-seven scientists from 29 countries have called for a boycott of a top South Korean university because of a new center aimed at using artificial intelligence (AI) to bolster national security. The AI scientists claim the university is developing autonomous weapons, or "killer robots," whereas university officials say the goal of the research is to improve existing defense systems. A web page that has since been removed by the university said the center, to be operated jointly with South Korean defense company Hanwha Systems, would work on "AI-based command and decision systems, composite navigation algorithms for mega-scale unmanned undersea vehicles, AI-based smart aircraft training systems, and AI-based smart object tracking and recognition technology." Toby Walsh, a computer scientist at the University of New South Wales in Sydney, Australia, who organized the boycott, fears that the research will be applied to autonomous weapons, which can include unmanned flying drones or submarines, cruise missiles, autonomously operated sentry guns, or battlefield robots.


'Killer robots': AI experts call for boycott over lab at South Korea university

The Guardian

Artificial intelligence researchers from nearly 30 countries are boycotting a South Korean university over concerns a new lab in partnership with a leading defence company could lead to "killer robots". More than 50 leading academics signed the letter calling for a boycott of Korea Advanced Institute of Science and Technology (KAIST) and its partner, defence manufacturer Hanwha Systems. The researchers said they would not collaborate with the university or host visitors from KAIST over fears it sought to "accelerate the arms race to develop" autonomous weapons. "There are plenty of great things you can do with AI that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern," said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales. "This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms."


Rise of the machines?

FOX News

"Killer robots" could be a real threat, warned researchers at the recent World Economic Forum. Unlike today's drones, which are still controlled by human operators, autonomous weapons could potentially be programmed to select and engage targets on their own. "It was one of the concerns that we itemized last year," Toby Walsh, professor of artificial intelligence (AI) at the school of computer science and engineering at the University of New South Wales, told FoxNews.com. "Most of us believe that we don't have the ability to build ethical robots," he added. "What is especially worrying is that the various militaries around the world will be fielding robots in just a few years, and we don't think anyone will be building ethical robots."