Pulling back the curtain on neural networks

AIHub 

When researchers at Oregon State University created new tools to evaluate the decision-making algorithms of an advanced artificial intelligence system, study participants assigned to use them did, indeed, find flaws in the AI's reasoning. But once investigators instructed participants to use the tools in a more structured and rigorous way, the number of bugs they discovered increased markedly.

"That surprised us a bit, and it showed that having good tools for visualizing and interfacing with AI systems is important, but it's only part of the story," said Alan Fern, professor of computer science at Oregon State.

Since 2017, Fern has led a team of eight computer scientists funded by a four-year, $7.1 million grant from the Defense Advanced Research Projects Agency to develop explainable artificial intelligence, or XAI -- algorithms through which humans can understand, build trust in, and manage the emerging generation of artificial intelligence systems.

Dramatic advancements in the artificial neural networks, or ANNs, at the heart of advanced AI have created a wave of powerful applications for transportation, defense, security, medicine, and other fields.
