Explaining Neural Networks without Access to Training Data
Marton, Sascha, Lüdtke, Stefan, Bartelt, Christian, Tschalzev, Andrej, Stuckenschmidt, Heiner
Artificial neural networks achieve impressive results for various modeling tasks [LeCun et al., 2015, Wang et al., 2020]. However, a downside of their superior performance and sophisticated structure is the comprehensibility of the learned models. In many domains, it is crucial to understand the function learned by a neural network, especially when it comes to decisions that affect people [Samek et al., 2019, Molnar, 2020]. A common approach to tackle the problem of interpretability without sacrificing the superior performance is using a surrogate model as a gateway to interpretability [Molnar, 2020]. Most existing global surrogate approaches use a distillation procedure to learn the surrogate model based on the predictions of the neural network [Molnar, 2020, Frosst and Hinton, 2017]. To this end, they query the neural network on a representative set of samples, and the resulting input-output pairs are then used to train the surrogate model. This representative sample usually comprises the training data of the original model, or at least follows its distribution [Molnar, 2020, Lopes et al., 2017]. However, there are many cases where the training data cannot easily be exposed due to privacy or safety concerns [Lopes et al., 2017, Bhardwaj et al., 2019, Nayak et al., 2019].
- North America > United States > Wisconsin > Dane County > Madison (0.04)
- North America > United States > New York > New York County > New York City (0.04)
- Europe > Switzerland > Basel-City > Basel (0.04)
- Asia > Taiwan (0.04)
- Banking & Finance (0.69)
- Health & Medicine > Therapeutic Area > Oncology (0.69)
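The distillation procedure the abstract describes can be sketched in a few lines: query a black-box model on a representative sample, then fit an interpretable surrogate on the resulting input-output pairs. This is a minimal illustration, not the paper's method; the `black_box_model` function, the sampling range, and the choice of a polynomial surrogate are all hypothetical stand-ins.

```python
import numpy as np

# Hypothetical stand-in for the trained neural network: a fixed nonlinear
# function we can only query, not inspect.
def black_box_model(x):
    return np.tanh(2.0 * x) + 0.1 * x

# Step 1: draw a representative sample (distillation normally assumes this
# follows the original training distribution).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=500)

# Step 2: query the black box to obtain input-output pairs.
y = black_box_model(x)

# Step 3: fit an interpretable surrogate -- here a degree-3 polynomial
# via least squares -- on the query results.
coeffs = np.polyfit(x, y, deg=3)
surrogate = np.poly1d(coeffs)

# The surrogate approximates the black box on the sampled region.
mse = np.mean((surrogate(x) - y) ** 2)
```

Note that step 1 is exactly where the privacy concern above bites: without access to the training data, there is no obvious way to produce a representative sample to query.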
Beyond Maximum Likelihood and Density Estimation: A Sample-Based Criterion for Unsupervised Learning of Complex Models
Hochreiter, Sepp, Mozer, Michael C.
Two well-known classes of unsupervised procedures that can be cast in this manner are generative and recoding models. In a generative unsupervised framework, the environment generates training examples, which we will refer to as observations, by sampling from one distribution; the other distribution is embodied in the model. Examples of generative frameworks are mixtures of Gaussians (MoG) [2], factor analysis [4], and Boltzmann machines [8]. In the recoding unsupervised framework, the model transforms points from an observation space to an output space, and the output distribution is compared either to a reference distribution or to a distribution derived from the output distribution. An example is independent component analysis (ICA) [11], a method that discovers a representation of vector-valued observations in which the statistical dependence among the vector elements in the output space is minimized.
- North America > United States > Colorado > Boulder County > Boulder (0.14)
- North America > Canada > Ontario > Toronto (0.14)
- Europe > France (0.05)
- (4 more...)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.67)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.41)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.41)
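The generative framework the abstract mentions can be illustrated with its simplest instance, a mixture of Gaussians fit by standard EM (i.e., the maximum-likelihood baseline the paper's sample-based criterion goes beyond). This is a minimal 1-D sketch with illustrative data and initialization, not the paper's criterion.

```python
import numpy as np

# The environment samples observations from one distribution: here,
# two Gaussian clusters (parameters are illustrative).
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 0.5, 200),
                       rng.normal(3.0, 0.8, 200)])

# The other distribution is embodied in the model: a two-component
# 1-D mixture of Gaussians, adjusted by EM to explain the observations.
means = np.array([-1.0, 1.0])
stds = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibility of each component for each observation.
    dens = (weights / (stds * np.sqrt(2 * np.pi)) *
            np.exp(-0.5 * ((data[:, None] - means) / stds) ** 2))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibility-weighted data.
    nk = resp.sum(axis=0)
    means = (resp * data[:, None]).sum(axis=0) / nk
    stds = np.sqrt((resp * (data[:, None] - means) ** 2).sum(axis=0) / nk)
    weights = nk / len(data)
```

After fitting, the component means land near the two cluster centers; a recoding method such as ICA would instead transform the observations and compare the output distribution to a reference.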