Video Friday: Robot Tentacle, Mars Flyer, and Destructive Drone Competition

IEEE Spectrum Robotics Channel

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months (send us your events!). Let us know if you have suggestions for next week, and enjoy today's videos.

Festo's Bionic Learning Network prototypes for this year are a bit less crazy than we're used to, but they're also far more practical, with immediate potential applications, especially in collaborative robotics. Festo presents a bionic gripper called the OctopusGripper, which is derived from an octopus tentacle. Free-moving, intuitive to operate, and safe when interacting with the user, the pneumatic lightweight robot is based on the human arm and has great potential as a sensitive helper for human–robot collaboration in the future.


Leading AI country will be 'ruler of the world,' says Putin

@machinelearnbot

Russian President Vladimir Putin warned Friday that the country that leads in artificial intelligence will be "the ruler of the world." AI development "raises colossal opportunities and threats that are difficult to predict now," Putin said in a lecture to students, warning that "it would be strongly undesirable if someone wins a monopolist position." Future wars will be fought by autonomous drones, Putin suggested, and "when one party's drones are destroyed by drones of another, it will have no other choice but to surrender."

U.N. urged to address lethal autonomous weapons

AI experts worldwide are also concerned. On August 20, 116 founders of robotics and artificial intelligence companies from 26 countries, including Elon Musk and Google DeepMind's Mustafa Suleyman, signed an open letter asking the United Nations to "urgently address the challenge of lethal autonomous weapons (often called 'killer robots') and ban their use internationally."


A Global Arms Race for Killer Robots Is Transforming the Battlefield

TIME

Over the weekend, experts on military artificial intelligence from more than 80 world governments converged on the U.N. offices in Geneva for the start of a week of talks on autonomous weapons systems. Many of them fear that after gunpowder and nuclear weapons, we are now on the brink of a "third revolution in warfare," heralded by killer robots--fully autonomous weapons that could decide whom to target and kill without human input. With autonomous technology already in development in several countries, the talks mark a crucial point for governments and activists who believe the U.N. should play a key role in regulating the technology. The meeting comes at a critical juncture: in July, Kalashnikov, the main defense contractor of the Russian government, announced it was developing a weapon that uses neural networks to make "shoot/no-shoot" decisions.


Pentagon's artificial intelligence programs get huge boost in defense budget

#artificialintelligence

On Monday, President Trump signed the $717 billion annual National Defense Authorization Act, which Congress had passed easily in the weeks prior. Much attention has understandably been placed on big-ticket items like $7.6 billion for acquiring 77 F-35 fighters, $21.9 billion for the nuclear weapons program, and $1.56 billion for three littoral combat ships--even though the Navy requested only one in the budget. What has gotten less attention is how the bill cements artificial intelligence programs in the Defense Department and lays the groundwork for a new national-level policy and strategy in the form of an artificial intelligence commission. As artificial intelligence and machine learning algorithms are integrated into defense technology, spending on these technologies is only going to increase in the years to come. While spending on many AI programs in the NDAA is currently in the tens of millions of dollars, one project that did not go through the normal appropriations process could cost a total of $1.75 billion over the next seven years.


Google Is Quietly Providing AI Technology for Drone Strike Targeting Project

#artificialintelligence

Google has quietly secured a contract to work on the Defense Department's new algorithmic warfare initiative, providing assistance with a pilot project to apply its artificial intelligence solutions to drone targeting. The military contract with Google is routed through a Northern Virginia technology staffing company called ECS Federal, obscuring the relationship from the public. The contract, first reported Tuesday by Gizmodo, is part of a rapid push by the Pentagon to deploy state-of-the-art artificial intelligence technology to improve combat performance. Google, which has made strides in applying its proprietary deep learning tools to improve language translation and vision recognition, has formed a cross-team collaboration within the company to work on the AI drone project. The team, The Intercept has learned, is working to develop deep learning technology to help drone analysts interpret the vast image data vacuumed up from the military's fleet of 1,100 drones, so they can better target bombing strikes against the Islamic State.