Arya, Vijay
AI Explainability 360: Impact and Design
Arya, Vijay, Bellamy, Rachel K. E., Chen, Pin-Yu, Dhurandhar, Amit, Hind, Michael, Hoffman, Samuel C., Houde, Stephanie, Liao, Q. Vera, Luss, Ronny, Mojsilovic, Aleksandra, Mourad, Sami, Pedemonte, Pablo, Raghavendra, Ramya, Richards, John, Sattigeri, Prasanna, Shanmugam, Karthikeyan, Singh, Moninder, Varshney, Kush R., Wei, Dennis, Zhang, Yunfeng
The increasing use of artificial intelligence (AI) systems in high stakes domains has been coupled with an increase in societal demands for these systems to provide explanations for their outputs. This societal demand has already resulted in new regulations requiring explanations (Goodman and Flaxman 2016; Wachter, Mittelstadt, and Floridi 2017; Selbst and Powles 2017; Pasternak 2019). Explanations can allow users to gain insight into the system's decision-making process, which is a key component in calibrating appropriate trust and confidence in AI systems (Doshi-Velez and Kim 2017).

We also introduced a taxonomy to navigate the space of explanation methods, not only the ten in the toolkit but also the broader literature on explainable AI. The taxonomy was intended to be usable by consumers with varied backgrounds to choose an appropriate explanation method for their application. AIX360 differs from other open source explainability toolkits (see Arya et al. (2020) for a list) in two main ways: 1) its support for a broad and diverse spectrum of explainability methods, implemented in a common architecture, and 2) its educational material as discussed below.
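The taxonomy can be thought of as a small set of questions about the explanation consumer's needs that narrows the choice to a family of methods. The Python sketch below is a hypothetical, simplified illustration of that idea; the question names and the method families it returns are assumptions made for illustration, not the toolkit's actual taxonomy implementation.

```python
# Hypothetical sketch: navigating an explainability taxonomy by answering a few
# questions about the explanation consumer's needs. The axes and method families
# below are illustrative simplifications, not AIX360's actual taxonomy code.

def suggest_method_family(wants_data_insight: bool,
                          directly_interpretable: bool,
                          global_scope: bool) -> str:
    """Map simple yes/no answers to a candidate family of explanation methods."""
    if wants_data_insight:
        # The consumer wants to understand the data itself, not a model's outputs.
        return "data explanations (e.g., prototypes, disentangled representations)"
    if directly_interpretable:
        # The consumer prefers a self-explaining model over a post-hoc explainer.
        return ("globally interpretable models (e.g., rule sets)"
                if global_scope
                else "locally interpretable models")
    # Post-hoc explanations of an existing black-box model.
    return ("global post-hoc explanations (e.g., surrogate models)"
            if global_scope
            else "local post-hoc explanations (e.g., contrastive, feature-based)")


if __name__ == "__main__":
    # A regulator auditing overall model behaviour might answer as follows:
    print(suggest_method_family(wants_data_insight=False,
                                directly_interpretable=False,
                                global_scope=True))
```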
One Explanation Does Not Fit All: A Toolkit and Taxonomy of AI Explainability Techniques
Arya, Vijay, Bellamy, Rachel K. E., Chen, Pin-Yu, Dhurandhar, Amit, Hind, Michael, Hoffman, Samuel C., Houde, Stephanie, Liao, Q. Vera, Luss, Ronny, Mojsilović, Aleksandra, Mourad, Sami, Pedemonte, Pablo, Raghavendra, Ramya, Richards, John, Sattigeri, Prasanna, Shanmugam, Karthikeyan, Singh, Moninder, Varshney, Kush R., Wei, Dennis, Zhang, Yunfeng
As artificial intelligence and machine learning algorithms make further inroads into society, calls are increasing from multiple stakeholders for these algorithms to explain their outputs. At the same time, these stakeholders, whether they be affected citizens, government regulators, domain experts, or system developers, present different requirements for explanations. Toward addressing these needs, we introduce AI Explainability 360 (http://aix360.mybluemix.net/), an open-source software toolkit featuring eight diverse and state-of-the-art explainability methods and two evaluation metrics. Equally important, we provide a taxonomy to help entities requiring explanations to navigate the space of explanation methods, not only those in the toolkit but also in the broader literature on explainability. For data scientists and other users of the toolkit, we have implemented an extensible software architecture that organizes methods according to their place in the AI modeling pipeline. We also discuss enhancements to bring research innovations closer to consumers of explanations, ranging from simplified, more accessible versions of algorithms, to tutorials and an interactive web demo to introduce AI explainability to different audiences and application domains. Together, our toolkit and taxonomy can help identify gaps where more explainability methods are needed and provide a platform to incorporate them as they are developed.
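The "extensible software architecture" mentioned above organizes explainers by where they act in the AI modeling pipeline and by their scope. The sketch below illustrates what such a common base-class design could look like; the class and method names are assumptions made for illustration and do not claim to match the toolkit's actual API.

```python
# Illustrative sketch of a common architecture that organizes explainers by
# scope (local vs. global) behind shared abstract interfaces. The class names
# are hypothetical and are not asserted to be AIX360's actual classes.
from abc import ABC, abstractmethod

import numpy as np


class LocalPostHocExplainer(ABC):
    """Explains individual predictions of an already-trained model."""

    @abstractmethod
    def explain_instance(self, model, x: np.ndarray) -> dict:
        ...


class GlobalPostHocExplainer(ABC):
    """Summarizes the overall behaviour of an already-trained model."""

    @abstractmethod
    def explain_model(self, model, X: np.ndarray) -> dict:
        ...


class PerturbationImportanceExplainer(LocalPostHocExplainer):
    """Toy local explainer: feature importance via single-feature perturbation."""

    def explain_instance(self, model, x: np.ndarray) -> dict:
        base = float(model.predict(x.reshape(1, -1))[0])
        importances = {}
        for i in range(x.size):
            perturbed = x.copy()
            perturbed[i] = 0.0  # crude baseline substitution for feature i
            importances[i] = base - float(model.predict(perturbed.reshape(1, -1))[0])
        return {"prediction": base, "feature_importance": importances}
```

Keeping every method behind a small number of shared interfaces like these is what allows new explainability algorithms to be added without changing consumer-facing code.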
An Axiomatic Framework for Ex-Ante Dynamic Pricing Mechanisms in Smart Grid
Bandyopadhyay, Sambaran (IBM Research) | Narayanam, Ramasuri (IBM Research) | Kumar, Pratyush (IBM Research) | Ramchurn, Sarvapali (University of Southampton) | Arya, Vijay (IBM Research) | Petra, Iskandarbin (Universiti Brunei Darussalam)
In electricity markets, the choice of the right pricing regime is crucial for utilities because the price they charge their consumers, in anticipation of their demand in real time, is a key determinant of their profits and ultimately their survival in competitive energy markets. Among existing pricing regimes, we consider ex-ante dynamic pricing schemes in this paper because (i) they help address the peak demand problem (a crucial problem in smart grids), and (ii) they are transparent and fair to consumers, as the cost of electricity can be calculated before the actual consumption. In particular, we propose an axiomatic framework that establishes the conceptual underpinnings of the class of ex-ante dynamic pricing schemes. We first propose five key axioms that reflect criteria vital for energy utilities and their relationship with consumers. We then prove an impossibility theorem showing that no pricing regime satisfies all five axioms simultaneously. We also study multiple cost functions arising from various pricing regimes to examine the subset of axioms that each satisfies. We believe the proposed framework is the first of its kind to evaluate the class of ex-ante dynamic pricing schemes in a manner that can be operationalised by energy utilities.
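As a rough illustration of how checking cost functions against axioms can be operationalised, the sketch below tests a toy ex-ante time-of-use cost function against two simplified properties. Both the cost function and the properties are hypothetical stand-ins chosen for illustration; they are not the five axioms proposed in the paper.

```python
# Hypothetical sketch: checking a toy ex-ante time-of-use (ToU) cost function
# against two illustrative properties. These properties are simplified stand-ins
# and are NOT the five axioms proposed in the paper.
from typing import Sequence


def tou_cost(consumption: Sequence[float], prices: Sequence[float]) -> float:
    """Ex-ante cost: per-slot prices are announced before consumption occurs."""
    return sum(q * p for q, p in zip(consumption, prices))


def satisfies_monotonicity(prices, base, increased) -> bool:
    """Consuming more in some slot should never reduce the bill."""
    return tou_cost(increased, prices) >= tou_cost(base, prices)


def rewards_peak_shifting(prices, peak_slot: int) -> bool:
    """Shifting one unit of load from the peak slot to the cheapest other slot
    should strictly reduce the bill."""
    flat = [1.0] * len(prices)
    cheapest = min((i for i in range(len(prices)) if i != peak_slot),
                   key=lambda i: prices[i])
    shifted = flat.copy()
    shifted[peak_slot] -= 1.0
    shifted[cheapest] += 1.0
    return tou_cost(shifted, prices) < tou_cost(flat, prices)


if __name__ == "__main__":
    prices = [0.10, 0.10, 0.30, 0.10]  # slot 2 carries the announced peak price
    print(satisfies_monotonicity(prices, [1, 1, 1, 1], [1, 1, 2, 1]))  # True
    print(rewards_peak_shifting(prices, peak_slot=2))                  # True
```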