
Neural Information Processing Systems 

Optimization is a key component of training machine learning models and has a strong impact on their generalization. In this paper, we consider a particular optimization method, the stochastic gradient Langevin dynamics (SGLD) algorithm, and investigate the generalization of models trained by SGLD.
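For context, a single SGLD iteration is a stochastic-gradient step with added isotropic Gaussian noise. The sketch below is an illustrative minimal implementation, not the paper's own code; the function name `sgld_step`, the quadratic toy loss, and the `temperature` parameterization are assumptions for the example.

```python
import numpy as np

def sgld_step(theta, grad, step_size, temperature=1.0, rng=None):
    # One SGLD update: a gradient-descent step plus Gaussian noise whose
    # variance (2 * step_size * temperature) is chosen so that, in the
    # small-step-size limit, the iterates sample from exp(-loss / temperature).
    rng = rng or np.random.default_rng()
    noise = rng.normal(size=np.shape(theta))
    return theta - step_size * grad + np.sqrt(2.0 * step_size * temperature) * noise

# Usage: run SGLD on a toy quadratic loss f(theta) = 0.5 * ||theta||^2,
# whose gradient is simply theta. The iterates hover near the minimum,
# with spread controlled by the temperature.
rng = np.random.default_rng(0)
theta = np.ones(2)
for _ in range(5000):
    theta = sgld_step(theta, grad=theta, step_size=0.01, temperature=0.1, rng=rng)
```

The noise term is what distinguishes SGLD from plain SGD: it turns the optimizer into an approximate sampler from a Gibbs distribution over parameters, which is the property that generalization analyses of SGLD typically exploit.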
