The Hunt for Explainable AI

#artificialintelligence 

As we face a future in which important decisions affecting the course of our lives may be made by artificial intelligence (AI), the idea that we should understand how AIs make decisions is gaining increasing currency. Which hill to position a 20-year-old soldier on, who gets (or does not get) a home mortgage, which treatment a cancer patient receives … such decisions, and many more, are already being made based on an often unverifiable technology. "The problem is that not all AI approaches are created equal," says Jeff Nicholson, a vice president at Pega Systems Inc., makers of AI-based Customer Relationship Management (CRM) software. "Certain 'black box' approaches to AI are opaque and simply cannot be explained."