Review of the NeurIPS paper: Finite Versus Infinite Neural Networks: an Empirical Study


This paper conducts a thorough empirical comparison of finite- and infinite-width networks. The architectures investigated are fully-connected networks and CNNs with and without global average pooling (GAP), each under both the standard and NTK parameterizations. Techniques such as regularization and ensembling are also applied to these models. From experiments spanning these settings, the authors draw conclusions from several viewpoints and distill best practices for using non-trainable kernels on the CIFAR-10 classification task.
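For context on what "non-trainable kernel" prediction means here, the sketch below computes the NNGP kernel of an infinite-width ReLU fully-connected network via the arc-cosine recursion and uses it for ridge-regularized kernel regression. This is an illustrative toy, not the paper's exact pipeline; the depth and the `sigma_w2`/`sigma_b2` hyperparameters are assumptions chosen for the example.

```python
import numpy as np

def nngp_relu_kernel(X1, X2, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    """NNGP kernel of an infinite-width ReLU FCN (arc-cosine recursion).

    Returns the cross-kernel K(X1, X2) plus the diagonal kernels for
    X1 and X2, which the recursion needs at every layer.
    """
    d = X1.shape[1]
    # Layer-0 kernel: scaled inner products of the raw inputs.
    K12 = X1 @ X2.T / d
    K11 = np.sum(X1 * X1, axis=1) / d
    K22 = np.sum(X2 * X2, axis=1) / d
    for _ in range(depth):
        norm = np.sqrt(np.outer(K11, K22))
        cos_t = np.clip(K12 / np.maximum(norm, 1e-12), -1.0, 1.0)
        theta = np.arccos(cos_t)
        # Closed form for E[relu(u) relu(v)] under the bivariate Gaussian
        # with covariance [[K11, K12], [K12, K22]].
        K12 = sigma_w2 * norm * (np.sin(theta) + (np.pi - theta) * cos_t) \
              / (2.0 * np.pi) + sigma_b2
        # ReLU halves the diagonal variance; sigma_w2 = 2 preserves it.
        K11 = sigma_w2 * K11 / 2.0 + sigma_b2
        K22 = sigma_w2 * K22 / 2.0 + sigma_b2
    return K12, K11, K22

def kernel_predict(K_train, K_test_train, y_train, ridge=1e-6):
    """Kernel ridge regression: mean prediction with a fixed kernel."""
    alpha = np.linalg.solve(K_train + ridge * np.eye(len(y_train)), y_train)
    return K_test_train @ alpha
```

With one-hot (or +/-1) training labels, class prediction is then just an argmax (or sign) over `kernel_predict` outputs; no gradient training of the network is involved, which is what makes the kernel "non-trainable".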