A self-consistent theory of Gaussian Processes captures feature learning effects in finite CNNs

Neural Information Processing Systems 

Despite its theoretical appeal, this viewpoint lacks a crucial ingredient of deep learning in finite DNNs, one lying at the heart of their success: feature learning. Here we consider DNNs trained with noisy gradient descent on a large training set and derive a self-consistent Gaussian Process theory accounting for strong finite-DNN and feature learning effects.
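The "noisy gradient descent" mentioned above can be illustrated with a minimal Langevin-type training loop: each parameter update is a gradient step on the (weight-decayed) loss plus i.i.d. Gaussian noise, so the parameters sample from a posterior-like distribution at a temperature set by the noise scale. The network, widths, learning rate, and temperature below are hypothetical choices for illustration, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, for illustration only)
X = rng.normal(size=(64, 1))
y = np.sin(X[:, 0])

# One-hidden-layer network with assumed width 32
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32,)) / np.sqrt(32)

lr, temperature, weight_decay = 1e-2, 1e-3, 1e-2

def forward(X, W1, W2):
    h = np.tanh(X @ W1)   # hidden activations
    return h @ W2, h      # network output, cached activations

f0, _ = forward(X, W1, W2)
mse0 = float(np.mean((f0 - y) ** 2))   # loss before training

for step in range(500):
    f, h = forward(X, W1, W2)
    err = f - y
    # Gradients of mean-squared error plus weight decay
    gW2 = h.T @ err / len(X) + weight_decay * W2
    gh = np.outer(err, W2) * (1 - h**2)      # backprop through tanh
    gW1 = X.T @ gh / len(X) + weight_decay * W1
    # Langevin update: gradient step plus Gaussian noise
    W1 += -lr * gW1 + np.sqrt(2 * lr * temperature) * rng.normal(size=W1.shape)
    W2 += -lr * gW2 + np.sqrt(2 * lr * temperature) * rng.normal(size=W2.shape)

f, _ = forward(X, W1, W2)
mse = float(np.mean((f - y) ** 2))   # loss after noisy training
```

At long times the noise and weight decay balance the gradient term, so the weights equilibrate to a stationary distribution rather than a point estimate; the self-consistent theory characterizes the resulting function-space statistics.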
