A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization

Neural Information Processing Systems

We analyze stochastic gradient algorithms for optimizing nonconvex, nonsmooth finite-sum problems. In particular, the objective function is the sum of a differentiable (possibly nonconvex) component and a possibly non-differentiable but convex component. We propose a proximal stochastic gradient algorithm based on variance reduction, called ProxSVRG+. Our main contribution lies in the analysis of ProxSVRG+: it recovers several existing convergence results and improves or generalizes them (in terms of the number of stochastic gradient oracle calls and proximal oracle calls). In particular, ProxSVRG+ generalizes the best results given by the SCSG algorithm, recently proposed in [Lei et al., NIPS'17] for the smooth nonconvex case. ProxSVRG+ is also more straightforward than SCSG and admits a simpler analysis. Moreover, ProxSVRG+ outperforms deterministic proximal gradient descent (ProxGD) for a wide range of minibatch sizes, which partially solves an open problem posed in [Reddi et al., NIPS'16].
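To make the setting concrete: the objective has the composite form min_x (1/n) Σ_i f_i(x) + h(x), where each f_i is smooth (possibly nonconvex) and h is convex but possibly non-differentiable and is handled through its proximal operator. The Python sketch below shows a generic proximal SVRG-style loop in this spirit. It is only an illustration, not the authors' exact ProxSVRG+ (which, like SCSG, also admits large-minibatch snapshot gradients rather than a full pass); the names prox_svrg_plus and grad_i, the step size eta, and the choice of an l1 prox for h are all assumptions made for the example.

```python
import numpy as np

def prox_l1(x, lam):
    # Proximal operator of h(x) = lam * ||x||_1 (soft-thresholding),
    # used here as one concrete choice of the convex nonsmooth component.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def prox_svrg_plus(grad_i, x0, n, eta=0.1, lam=0.01, epochs=10, batch=8, rng=None):
    # grad_i(x, i) should return the gradient of the i-th smooth component
    # f_i at x. Each epoch computes a gradient estimate at a snapshot point,
    # then takes variance-reduced minibatch proximal steps.
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    for _ in range(epochs):
        snapshot = x.copy()
        # Snapshot gradient; ProxSVRG+/SCSG would allow a large minibatch
        # here instead of a full pass over all n components.
        mu = np.mean([grad_i(snapshot, i) for i in range(n)], axis=0)
        for _ in range(n // batch):
            idx = rng.integers(0, n, size=batch)
            # Variance-reduced stochastic gradient estimate.
            v = np.mean([grad_i(x, i) - grad_i(snapshot, i) for i in idx],
                        axis=0) + mu
            # Proximal step handles the nonsmooth convex part.
            x = prox_l1(x - eta * v, eta * lam)
    return x

# Usage on a lasso-style instance: f_i(x) = 0.5*(a_i @ x - b_i)**2, h = lam*||x||_1.
A, b = np.random.randn(100, 10), np.random.randn(100)
x_hat = prox_svrg_plus(lambda x, i: (A[i] @ x - b[i]) * A[i], np.zeros(10), n=100)
```

The design point the sketch tries to surface is that the snapshot gradient mu corrects each minibatch estimate, so the estimator's variance shrinks as the iterate approaches the snapshot; this correction is what allows such methods to compete with deterministic ProxGD across a range of minibatch sizes.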




Reviews: Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions

Neural Information Processing Systems

NeurIPS 2019, Sun Dec 8 through Sat Dec 14, 2019, Vancouver Convention Center. Paper #8670, "Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions". The reviewers agree that this submission represents an important contribution to the field. Please be sure to carefully review and address the concerns of all reviewers in the revision.


Graph Structured Prediction Energy Networks

Neural Information Processing Systems

Specifically, GSPENs combine the capabilities of classical structured prediction models and SPENs: they can explicitly model local structure when it is known or assumed, while retaining the ability to learn an unknown or more global structure implicitly.
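As a rough illustration of that combination (a hypothetical sketch, not the paper's implementation), the energy below adds explicit local terms in the style of a classical structured model, unary and pairwise potentials over a relaxed label assignment, to a learned scalar energy on the whole prediction, which plays the SPEN role of capturing global structure implicitly. All names (gspen_energy, global_net, and the input shapes) are assumptions for the example.

```python
import numpy as np

def gspen_energy(y_soft, unary, pairwise, global_net):
    # y_soft: (num_nodes, num_labels) relaxed label assignment.
    # Explicit local structure: classical unary potentials...
    local = np.sum(unary * y_soft)
    # ...plus pairwise potentials on known/assumed edges (i, j).
    for (i, j), pot in pairwise.items():
        local += np.sum(pot * np.outer(y_soft[i], y_soft[j]))
    # Implicit global structure: a learned scalar-valued energy on the
    # full prediction (the SPEN component); any callable works here.
    return local + global_net(y_soft)

# Toy usage: 3 nodes, 2 labels, one edge, and a quadratic stand-in "network".
y = np.full((3, 2), 0.5)
u = np.random.randn(3, 2)
pw = {(0, 1): np.random.randn(2, 2)}
e = gspen_energy(y, u, pw, lambda ys: float(np.sum(ys ** 2)))
```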



Reviews: Finite-Time Performance Bounds and Adaptive Learning Rate Selection for Two Time-Scale Reinforcement Learning

Neural Information Processing Systems

NeurIPS 2019, Sun Dec 8 through Sat Dec 14, 2019, Vancouver Convention Center. Paper #2626, "Finite-Time Performance Bounds and Adaptive Learning Rate Selection for Two Time-Scale Reinforcement Learning". The reviewers unanimously support acceptance. We encourage the authors to strongly consider the suggestions provided by the reviewers when preparing the camera-ready version.



Reviews: A Primal Dual Formulation For Deep Learning With Constraints

Neural Information Processing Systems

NeurIPS 2019, Sun Dec 8 through Sat Dec 14, 2019, Vancouver Convention Center. Paper #6594, "A Primal Dual Formulation For Deep Learning With Constraints". All reviewers were positive about the contributions in the paper, so I recommend acceptance. Please take into account all the reviewers' comments when preparing the final version of the paper.