Russian President Vladimir Putin warned Friday that AI development "raises colossal opportunities and threats that are difficult to predict now," adding in a lecture to students that "it would be strongly undesirable if someone wins a monopolist position." Future wars will be fought by autonomous drones, Putin suggested, and "when one party's drones are destroyed by drones of another, it will have no other choice but to surrender."

U.N. urged to address lethal autonomous weapons

AI experts worldwide are also concerned. On August 20, 116 founders of robotics and artificial intelligence companies from 26 countries, including Elon Musk and Google DeepMind's Mustafa Suleyman, signed an open letter asking the United Nations to "urgently address the challenge of lethal autonomous weapons (often called 'killer robots') and ban their use internationally."
Z Advanced Computing, Inc. (ZAC) of Potomac, MD announced on August 27 that it has been funded by the US Air Force to apply ZAC's detailed 3D image recognition technology, based on Explainable-AI, to aerial image/object recognition for drones (unmanned aerial vehicles, or UAVs). ZAC is the first to demonstrate Explainable-AI, in which various attributes and details of 3D (three-dimensional) objects can be recognized from any view or angle. "With our superior approach, complex 3D objects can be recognized from any direction, using only a small number of training samples," said Dr. Saied Tadayon, CTO of ZAC. "For complex tasks, such as drone vision, you need ZAC's superior technology to handle detailed 3D image recognition." "You cannot do this with the other techniques, such as Deep Convolutional Neural Networks, even with an extremely large number of training samples. That's basically hitting the limits of the CNNs," continued Dr. Bijan Tadayon, CEO of ZAC.
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!). Let us know if you have suggestions for next week, and enjoy today's videos.

Festo's Bionic Learning Network prototypes for this year are a bit less crazy than we're used to, but they're also far more practical, with immediate potential applications, especially in collaborative robotics. Festo presents a bionic gripper called the OctopusGripper, which is derived from an octopus tentacle. Free-moving, intuitive to operate, and safe when interacting with the user: the pneumatic lightweight robot is based on the human arm and has great potential as a sensitive helper for human–robot collaboration in the future.
Google has quietly secured a contract to work on the Defense Department's new algorithmic warfare initiative, providing assistance with a pilot project to apply its artificial intelligence solutions to drone targeting. The military contract with Google is routed through a Northern Virginia technology staffing company called ECS Federal, obscuring the relationship from the public. The contract, first reported Tuesday by Gizmodo, is part of a rapid push by the Pentagon to deploy state-of-the-art artificial intelligence technology to improve combat performance. Google, which has made strides in applying its proprietary deep learning tools to improve language translation and vision recognition, has a cross-team collaboration within the company to work on the AI drone project. The team, The Intercept has learned, is working to develop deep learning technology to help drone analysts interpret the vast image data vacuumed up from the military's fleet of 1,100 drones, to better target bombing strikes against the Islamic State.
Over the weekend, experts on military artificial intelligence from more than 80 world governments converged on the U.N. offices in Geneva for the start of a week of talks on autonomous weapons systems. Many of them fear that, after gunpowder and nuclear weapons, we are now on the brink of a "third revolution in warfare," heralded by killer robots: fully autonomous weapons that could decide whom to target and kill without human input. With autonomous technology already in development in several countries, the talks mark a crucial point for governments and activists who believe the U.N. should play a key role in regulating the technology. The meeting comes at a critical juncture. In July, Kalashnikov, the main defense contractor of the Russian government, announced it was developing a weapon that uses neural networks to make "shoot/no-shoot" decisions.