Review for NeurIPS paper: A Dynamical Central Limit Theorem for Shallow Neural Networks

Neural Information Processing Systems

The paper provides CLT-like results for the dynamics of single-hidden-layer, wide neural networks in the mean-field limit. The authors also show that, under certain conditions, the long-time fluctuations can be controlled by a Monte Carlo-type resampling error. The reviewers gave a positive assessment of the finite-width analysis and the strength of some of the technical contributions. They did, however, raise a variety of concerns regarding the asymptotic nature of the results (in both n and t), the assumptions on D̂, and the lack of results on discretization. While some of these concerns were alleviated by the authors' response, the more critical reviewers maintained their scores, and one positive reviewer slightly decreased theirs from 8 to 7. I agree with the reviewers that CLT-type results for finite-width networks are indeed interesting.