SCRN escapes saddle points and converges to local minimizers faster under the Strong Growth Condition (SGC).
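For the reader's convenience, the SGC is commonly stated as follows (this is the standard formulation from the stochastic optimization literature; the paper's exact constants may differ):

$$
\mathbb{E}_{\xi}\big[\|\nabla f(x;\xi)\|^2\big] \;\le\; \rho\,\|\nabla F(x)\|^2 \quad \text{for all } x,
$$

where $F(x) = \mathbb{E}_{\xi}[f(x;\xi)]$ is the objective and $\rho \ge 1$. In particular, SGC forces the stochastic gradient noise to vanish at stationary points of $F$.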
–Neural Information Processing Systems
We thank all the reviewers for their valuable comments. Prior works (e.g., [VBS18]) considered only convergence to critical points; we provide our results in both the zeroth- and higher-order settings. Our analysis handles the SGC assumption for unbounded functions, which had not been done before in the literature. The analysis of SCRN is also significantly more involved under SGC (especially in the zeroth-order setup); see also Remarks 6 and 7. Please see Lines 2-10 above. Note, however, that the method in [AL18] is a theoretical computer science style reduction approach.