On the Impacts of the Random Initialization in the Neural Tangent Kernel Theory
Yicheng Li
Department of Statistics and Data Science, Tsinghua University
Neural Information Processing Systems
This paper discusses the impact of the random initialization of neural networks in neural tangent kernel (NTK) theory, an aspect overlooked by most recent work on the NTK. It is well known that as the network's width tends to infinity, a neural network with random initialization converges to a Gaussian process f
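The Gaussian-process limit mentioned above can be illustrated numerically. The following sketch (not from the paper; the architecture, width, and scaling are illustrative assumptions) samples the output of a randomly initialized two-layer ReLU network at a fixed input, many times, under the standard 1/sqrt(m) scaling; as the width m grows, the empirical output distribution concentrates around a zero-mean Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, m):
    """Output of a width-m two-layer ReLU network at random initialization.

    Weights are drawn i.i.d. N(0, 1); the 1/sqrt(m) output scaling is the
    one under which the infinite-width limit is a Gaussian process.
    """
    W = rng.standard_normal((m, x.shape[0]))  # first-layer weights
    a = rng.standard_normal(m)                # second-layer weights
    return a @ np.maximum(W @ x, 0.0) / np.sqrt(m)

# Fix an input and resample the initialization many times.
x = np.array([1.0, 0.0])
samples = np.array([f(x, m=4096) for _ in range(2000)])

# For ||x|| = 1, the limiting variance is E[relu(z)^2] = 1/2 for z ~ N(0,1),
# so the empirical mean should be near 0 and the std near sqrt(1/2).
print(samples.mean(), samples.std())
```

Each draw of `samples` corresponds to one fresh random initialization, so the histogram of `samples` approximates the distribution of the network output at initialization, which is the object the NTK theory takes as its starting point.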