From the battlefield to the back office, artificial intelligence has the potential to transform how the Defense Department does business: increasing the speed of decision making, making sense of complex data sets and improving efficiency in back-office operations. Ensuring that AI is developed, procured and used responsibly and ethically is a top priority for the department's leadership.

"As the Department of Defense embraces artificial intelligence, it is imperative that we adopt responsible behavior, processes and outcomes in a manner that reflects the department's commitment to its core set of ethical principles," Deputy Secretary of Defense Dr. Kathleen Hicks wrote in a department-wide memorandum released last week.

As part of that commitment to responsible artificial intelligence, or RAI, the memorandum sets forth foundational tenets for implementation across the department: a governance structure and processes to provide oversight and accountability; warfighter trust to ensure fidelity in the AI capability and its use; a systems engineering and risk management approach to implementation across the AI product and acquisition lifecycle; and a robust ecosystem to ensure collaboration across government, academia, industry and allies and to build an AI-ready workforce. The memorandum also spells out how the Joint Artificial Intelligence Center will serve as the lead in coordinating the implementation and oversight of the department's RAI efforts.
"Robots exist in an open world where you can't predict everything that's going to happen. The robot has to have some autonomy in order to act and react in a real situation. It needs to make decisions to protect itself, but it also needs to transfer control to humans when appropriate. You don't want a robot to drive off a ledge, for instance -- unless a human needs the robot to drive off the ledge. When those situations happen, you need to have smooth transfer of control from the robot to the appropriate human," Woods said.
Yet only a few companies are publicly discussing their ongoing work in this area in a substantive, transparent and proactive way. Many others seem to fear the negative consequences, such as reputational risk, of sharing their vulnerabilities. Some companies are also waiting for a "finished product," wanting to point to tangible, positive outcomes before they are ready to reveal their work.
"I've got a lot of friends who are gun owners. I've got a lot of friends who are NRA (National Rifle Association) members. We had responsible gun ownership, but I was taught the right way to respect that tool," he said. "At the same time, their petition that they were speaking about is a very good one. And I also fear that their campaign -- they have to watch that they don't get hijacked."