Review: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes

Neural Information Processing Systems 

This review has two parts. The first is my review of the paper as a standalone submission; the second is a meta-commentary unifying my reviews of both this paper and "Neural Tangent Kernel for Any Architecture".

Part 1

This paper demonstrates that infinitely wide architectures built from a range of standard building blocks are Gaussian processes. Fundamentally, the paper appears to make two core contributions. It is a clean, elegant, and logical next step in an important research direction.
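As a concrete illustration of the claim under review (this sketch is mine, not the paper's code), the following hypothetical NumPy demo checks the Gaussian-process limit for the simplest case: a one-hidden-layer ReLU network with standard scaling, W1 ~ N(0, 1/d) and W2 ~ N(0, 2/n). In the wide limit, the scalar output at a fixed input x is Gaussian with variance ||x||^2 / d; the function name and chosen widths are assumptions for illustration.

```python
import numpy as np

def random_net_output(x, width, rng):
    # One random wide ReLU network: W1 ~ N(0, 1/d), W2 ~ N(0, 2/width).
    # In the infinite-width limit the output is Gaussian with
    # variance ||x||^2 / d.
    d = x.shape[0]
    w1 = rng.normal(0.0, np.sqrt(1.0 / d), size=(width, d))
    w2 = rng.normal(0.0, np.sqrt(2.0 / width), size=width)
    return w2 @ np.maximum(w1 @ x, 0.0)  # scalar output

rng = np.random.default_rng(0)
d, width, n_samples = 16, 2048, 20000
x = np.ones(d)  # ||x||^2 / d = 1, so the limiting variance is 1
outs = np.array([random_net_output(x, width, rng) for _ in range(n_samples)])
print(np.var(outs))  # close to the analytic value 1.0
```

The empirical output variance over many random networks matches the analytic kernel value, and a histogram of `outs` looks Gaussian; the paper's contribution is extending this kind of limit beyond simple feedforward layers to essentially arbitrary architectures.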