Review for NeurIPS paper: Constant-Expansion Suffices for Compressed Sensing with Generative Priors

Summary and Contributions: This paper concerns compressed sensing (CS) under generative priors. In this problem, one observes undersampled linear measurements of a signal of interest and seeks to recover the signal. The ambiguity inherent in the underdetermined linear system is resolved by restricting the recovery to points in the range of a trained generative model (such as a GAN), which are found by solving an empirical risk minimization problem. Existing theory establishes a convergence guarantee for an efficient algorithm under an appropriate random model for the weights of the generative prior. That guarantee, however, assumes the generative model is a multilayer perceptron in which the width of each layer grows log-linearly in the width of the preceding layer.
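
For concreteness, a minimal sketch of the recovery program described above, in my own notation (the paper's symbols may differ): given a measurement matrix $A \in \mathbb{R}^{m \times n}$ with $m \ll n$, measurements $y = A x^*$ of an unknown signal $x^*$, and a trained generator $G : \mathbb{R}^k \to \mathbb{R}^n$, one solves

$$\hat{z} \in \operatorname*{argmin}_{z \in \mathbb{R}^k} \, \lVert A\,G(z) - y \rVert_2^2,$$

and outputs $\hat{x} = G(\hat{z})$ as the recovered signal.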