Yet only a few companies are publicly discussing their ongoing work in this area in a substantive, transparent, and proactive way. Many others appear to fear the negative consequences, such as reputational risk, of disclosing their vulnerabilities. Some companies are also waiting for a "finished product," preferring to point to tangible, positive outcomes before they reveal their work.
Ask a person on the street, and chances are they'll tell you they are both optimistic and anxious about AI. This conflicted perspective makes sense: AI is already appearing in ways that have the potential to both scare and inspire us. The 2018 Fjord trends for business, technology, and design suggest a potential path to alleviating those fears: adopt a values-sensitive framework for Responsible AI.
Recently, Gartner released a series of Predicts 2021 research reports, including one that highlights the serious, wide-reaching ethical and social problems it predicts artificial intelligence (AI) will cause in the next several years. The race toward digital transformation and the abundance of data have pushed companies to invest in artificial intelligence technologies. With that, the concept of responsible AI has taken center stage in discussions among governments, enterprises, and other technology advocates and critics. A quick look at search trends shows that terms like "Ethical AI" and "Responsible AI" have gained popularity over the past five years. What is the reason behind this? Currently, the presence of bias in training data for AI models and a lack of transparency (the "black box" problem) threaten the prospect of using AI for good.