Review for NeurIPS paper: Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance
–Neural Information Processing Systems
Additional Feedback: Remarks and questions:
* The current standard for regularizing OT is entropic regularization of the transport plan (the papers of Cuturi [5], and the sample-complexity results [3,4]). This paper mostly ignores that literature, which is surprising given that the goals are (almost) the same. Since entropic regularization can (and arguably should) be viewed as a "cheap proxy" for Gaussian smoothing, a proper and detailed comparison seems in order.
* The authors appear to rely on a re-sampling scheme: drawing X from P_n and Z from N(0, sigma^2 I) independently and adding them produces samples from the convolution P_n * N_sigma. A potential problem is that coping with the curse of dimensionality (CoD) this way might require a number of samples exponential in the dimension.
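For concreteness, a minimal NumPy sketch of the re-sampling step described above (the function name and interface are my own, for illustration only): drawing X uniformly from the n data points gives X ~ P_n, and adding independent Gaussian noise Z yields X + Z ~ P_n * N_sigma.

```python
import numpy as np

def sample_smoothed_empirical(data, sigma, m, seed=None):
    """Draw m samples from P_n * N_sigma, where P_n is the empirical
    measure of `data` (an (n, d) array). Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    idx = rng.integers(0, n, size=m)             # X ~ P_n (uniform over the n points)
    noise = rng.normal(0.0, sigma, size=(m, d))  # Z ~ N(0, sigma^2 I_d), independent of X
    return data[idx] + noise                     # X + Z ~ P_n * N_sigma

# Example: smooth a two-point empirical measure in R^3
data = np.zeros((2, 3))
samples = sample_smoothed_empirical(data, sigma=0.1, m=1000, seed=0)
```

Each smoothed sample costs O(d) to generate, so the cost concern raised above is not about generating individual samples but about how many such samples are needed for accurate estimation in high dimension.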
Jan-22-2025, 05:43:03 GMT