Review: The Impact of Regularization on High-dimensional Logistic Regression

Neural Information Processing Systems 

Originality: This paper develops asymptotic theory for high-dimensional regularized logistic regression (LR). The main result (Theorem 1) is proved for any locally Lipschitz function \Psi, which in special cases yields asymptotics for common descriptive statistics such as the correlation, variance, and mean-squared error. Special-case results for L1- and L2-regularized LR are also derived, including the quantities highlighted in point 1 above. The paper further demonstrates that numerical simulation results align with the theoretical predictions.

Quality: The paper contains high-quality results and proofs; the notation and setup are well defined in Section 2 before the main results.
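To make the setting concrete for other readers: a minimal sketch (not the authors' code) of the regime the paper studies, where the sample size n and dimension p grow proportionally. It fits L2-regularized LR by plain gradient descent and evaluates one \Psi-type statistic, the correlation between the estimate and the true signal. The scalings, regularization strength, and step size below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical simulation of the proportional high-dimensional regime p/n = 0.5.
rng = np.random.default_rng(0)
n, p, lam = 400, 200, 0.1                     # samples, features, L2 strength (assumed values)

beta_true = rng.normal(size=p) / np.sqrt(p)   # signal scaled so X @ beta_true is O(1)
X = rng.normal(size=(n, p))                   # i.i.d. standard Gaussian design
probs = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = rng.binomial(1, probs).astype(float)

# Gradient descent on the L2-regularized logistic loss.
beta = np.zeros(p)
for _ in range(2000):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))      # predicted probabilities
    grad = X.T @ (mu - y) / n + lam * beta    # logistic-loss gradient + L2 penalty
    beta -= 0.5 * grad

# One descriptive statistic covered by the \Psi framework: correlation with the truth.
corr = np.corrcoef(beta, beta_true)[0, 1]
print(corr)
```

Swapping the penalty term `lam * beta` for a subgradient of the L1 norm would give the analogous L1-regularized experiment; the theory predicts the limiting value of statistics like `corr` as n, p grow with p/n fixed.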