AI Accountability: Proceed at Your Own Risk - InformationWeek


A report from technology research firm Forrester, "AI Aspirants: Caveat Emptor," highlights the growing need for third-party accountability in artificial intelligence tools. The report found that a lack of accountability in AI can result in regulatory fines, brand damage, and lost customers, all of which can be avoided by performing third-party due diligence and adhering to emerging best practices for responsible AI development and deployment.

The risks of getting AI wrong are real and, unfortunately, they're not always directly within the enterprise's control, the report observed. "Risk assessment in the AI context is complicated by a vast supply chain of components with potentially nonlinear and untraceable effects on the output of the AI system," it stated.

Most enterprises partner with third parties to create and deploy AI systems because they don't have the necessary technology and skills in house to perform these tasks on their own, said report author Brandon Purcell, a Forrester principal analyst who covers customer analytics and artificial intelligence issues.
