Deep Diffusion-Invariant Wasserstein Distributional Classification
In this paper, we present a novel classification method called deep diffusion-invariant Wasserstein distributional classification (DeepWDC). DeepWDC represents input data and labels as probability measures to address severe perturbations in input data. It can output the optimal label measure in terms of diffusion invariance, where the label measure is stationary over time and becomes equivalent to a Gaussian measure. Furthermore, DeepWDC minimizes the 2-Wasserstein distance between the optimal label measure and the Gaussian measure, which reduces the Wasserstein uncertainty. Experimental results demonstrate that DeepWDC can substantially enhance the accuracy of several baseline deterministic classification methods and outperforms state-of-the-art methods on 2D and 3D data containing various types of perturbations (e.g., rotations, impulse noise, and down-scaling).
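As background for the objective described above: between two Gaussian measures the 2-Wasserstein distance has a simple closed form, which is one reason matching a label measure to a Gaussian is computationally convenient. The sketch below is a minimal one-dimensional illustration of that closed form; the function name and the numeric values are illustrative and not taken from the paper.

```python
import numpy as np

def w2_gaussians_1d(m1, s1, m2, s2):
    """Closed-form 2-Wasserstein distance between 1-D Gaussians
    N(m1, s1^2) and N(m2, s2^2): W2^2 = (m1 - m2)^2 + (s1 - s2)^2."""
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

# Distance between a hypothetical predicted label measure N(0.2, 0.9^2)
# and a standard Gaussian target N(0, 1):
print(w2_gaussians_1d(0.2, 0.9, 0.0, 1.0))  # sqrt(0.05) ~= 0.2236
```

Driving this quantity to zero makes the label measure coincide with the target Gaussian, which is the sense in which the "Wasserstein uncertainty" above is reduced.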
Review for NeurIPS paper: Deep Diffusion-Invariant Wasserstein Distributional Classification
Strengths: * The paper addresses the interesting setting of a classification problem in which both inputs and targets are represented by probability measures. The setting is novel, and the proposed derivations are theoretically and empirically supported. The proposed architecture comprises two networks: a measure-to-measure mapping network f, which realizes a push-forward operation, and a prediction network g, which takes the measure produced by f as input and predicts the final label. An explicit formulation of the diffusion operator, amenable to computation in a deep learning setting, is described. A theoretical justification of the exponential decay of the diffusion-invariance term is also provided.
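The two-network pipeline described in the review (a push-forward map f applied to an input measure, followed by a prediction network g on the resulting measure) can be sketched schematically as follows. This is a toy stand-in, not the paper's architecture: the linear maps, the mean-embedding readout, and all names here are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two networks: toy linear maps.
W_f = rng.normal(size=(2, 2))   # parameters of the "push-forward" map f
W_g = rng.normal(size=(2, 3))   # parameters of the predictor g (3 classes)

def f(x_samples):
    """Push-forward T#mu: apply the map samplewise to an empirical
    input measure represented by a cloud of samples."""
    return np.tanh(x_samples @ W_f)

def g(y_samples):
    """Predict a label from the pushed-forward measure; here a simple
    mean embedding of the sample cloud followed by linear scoring."""
    scores = y_samples.mean(axis=0) @ W_g
    return int(np.argmax(scores))

x = rng.normal(size=(100, 2))   # 100 samples of an input measure
label = g(f(x))                 # predicted class index in {0, 1, 2}
print(label)
```

The key structural point mirrored here is that f acts on samples (so it induces a map between measures), while g consumes the whole resulting measure rather than a single point.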