Segment Integrated Gradients: Better attributions through regions

Andrei Kapishnikov, Tolga Bolukbasi, Fernanda Viégas, Michael Terry

arXiv.org Machine Learning 

Saliency methods can aid understanding of deep neural networks. Recent years have witnessed many improvements to saliency methods, as well as new ways for evaluating them. In this paper, we 1) present a novel region-based attribution method, Segment-Integrated Gradients (SIG), that builds upon integrated gradients [23], 2) introduce evaluation methods for empirically assessing the quality of image-based saliency maps (Performance Information Curves (PICs)), and 3) contribute an axiom-based sanity check.

These evaluation methods provide ways to validate the saliency method's outputs (e.g., to ensure they can be relied upon to explain model behavior) [2, 1], or to empirically measure the methods' outputs, enabling comparison of two or more techniques. For example, "sanity checks" have been developed that help determine whether a saliency method's results meaningfully correspond to a model's learned parameters [1], while Sensitivity-n [2] empirically measures the quality of a saliency method's output by comparing the change in the output prediction to the sum of attributions.
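As background for the integrated-gradients technique that SIG builds upon, here is a minimal sketch of the standard Riemann-sum approximation. The model `F` and its gradient are toy stand-ins (F(x) = Σ x², with analytic gradient 2x), not the paper's networks; only the attribution recipe itself follows the integrated-gradients definition.

```python
import numpy as np

def F(x):
    # Toy differentiable "model": F(x) = sum of squares.
    return np.sum(x ** 2)

def grad_F(x):
    # Analytic gradient of the toy model.
    return 2 * x

def integrated_gradients(x, baseline, steps=100):
    # Average the gradient along the straight-line path from the
    # baseline to the input (midpoint Riemann sum), then scale by
    # the input-baseline difference.
    alphas = (np.arange(steps) + 0.5) / steps
    grads = np.mean(
        [grad_F(baseline + a * (x - baseline)) for a in alphas], axis=0
    )
    return (x - baseline) * grads

x = np.array([1.0, 2.0, 3.0])
baseline = np.zeros_like(x)
attr = integrated_gradients(x, baseline)
# Completeness axiom: the attributions sum to F(x) - F(baseline),
# which is the property Sensitivity-n-style checks probe empirically.
```

For this quadratic toy model the midpoint rule is exact, so `attr.sum()` equals `F(x) - F(baseline)` to floating-point precision; for real networks the approximation improves as `steps` grows.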
