Trust Algorithms? The Army Doesn't Even Trust Its Own AI Developers - War on the Rocks

Last month, an artificial intelligence agent defeated human F-16 pilots in a Defense Advanced Research Projects Agency challenge, reigniting discussions about lethal AI and whether it can be trusted. Allies, non-governmental organizations, and even the U.S. Defense Department have weighed in on the question. But why is the U.S. military worried about trusting algorithms when it does not even trust its own AI developers?

Any organization's adoption of AI and machine learning requires three technical tools: usable digital data from which machine learning algorithms can learn, computational capabilities to power the learning process, and the development environment engineers use to write code. However, the military's precious few uniformed data scientists, machine learning engineers, and data engineers who create AI-enabled applications are currently hamstrung by a lack of access to these tools.