"Why Not Other Classes?": Towards Class-Contrastive Back-Propagation Explanations

Neural Information Processing Systems 

Existing explanation methods are often limited to explaining predictions with respect to a pre-specified class, answering the question "why is the input classified into this class?" However, such single-class explanations are inherently insufficient because they do not capture features with class-discriminative power: a feature that is important for predicting one class may be equally important for other classes, and therefore carries no information about why this class was chosen over the alternatives.
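The abstract's point can be illustrated with a toy linear classifier. This is a minimal sketch, not the paper's method: the weights and the "subtract the mean of the other logits" contrast below are hypothetical choices made purely for illustration. For a linear model with logits z = W x, the gradient explanation of a single class logit is just that class's weight row, while a contrastive explanation differentiates the target logit minus the other logits, which cancels features shared across classes:

```python
import numpy as np

# Hypothetical linear classifier: logits z = W @ x over two features f0, f1.
W = np.array([[1.0, 1.0],    # class 0: relies on f0 and f1
              [1.0, -1.0]])  # class 1: relies on f0 (positively) and f1 (negatively)

def plain_saliency(W, target):
    # Gradient of the target logit alone: d z_t / d x = W[target].
    return W[target]

def contrastive_saliency(W, target):
    # Gradient of (target logit - mean of the other logits) w.r.t. the input.
    others = np.delete(W, target, axis=0)
    return W[target] - others.mean(axis=0)

print(plain_saliency(W, 0))        # [1. 1.] -> f0 and f1 look equally important
print(contrastive_saliency(W, 0))  # [0. 2.] -> shared feature f0 drops out
```

The single-class gradient rates f0 and f1 as equally important for class 0, yet f0 is just as important for class 1, so it says nothing about why class 0 was preferred; the contrastive gradient assigns f0 zero attribution and isolates the genuinely discriminative feature f1.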
