Review for NeurIPS paper: Deep Diffusion-Invariant Wasserstein Distributional Classification


Strengths:
* The paper addresses the interesting problem of classification where both inputs and targets are represented by probability measures. The setting is novel, and the proposed derivations are supported both theoretically and empirically.
* The proposed architecture comprises two networks: a measure-to-measure mapping network f, which realizes a push-forward operation, and a prediction network g, which takes the measure produced by f as input and predicts the final label.
* An explicit formulation of the diffusion operator, amenable to computation in a deep-learning setting, is described.
* A theoretical justification of the exponential decay of the diffusion-invariance term is also provided.
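To make the two-stage pipeline concrete for other readers: the composition described above can be sketched with measures represented as finite particle clouds, where f acts as a push-forward (applying a map to every particle) and g pools the pushed particles to a label. This is a minimal illustrative sketch, not the paper's implementation; the function names `f_pushforward` and `g_predict`, the tanh map, and all shapes are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_pushforward(particles, W, b):
    # Push-forward T#mu: apply the map T(x) = tanh(xW + b) to each
    # particle of the empirical measure (illustrative map, not the paper's).
    return np.tanh(particles @ W + b)

def g_predict(particles, V):
    # Prediction network g: mean-pool the pushed measure's particles,
    # then score each class with a linear head and take the argmax.
    pooled = particles.mean(axis=0)
    scores = pooled @ V
    return int(np.argmax(scores))

# Hypothetical setup: an input measure as 100 particles in R^4, 3 classes.
mu = rng.normal(size=(100, 4))
W = rng.normal(size=(4, 8))
b = rng.normal(size=8)
V = rng.normal(size=(8, 3))

label = g_predict(f_pushforward(mu, W, b), V)
```

The key design point the review highlights is that both stages operate on measures: f never collapses the distribution, and only g reduces it to a label.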