A Gradient Sampling Method With Complexity Guarantees for Lipschitz Functions in High and Low Dimensions
–Neural Information Processing Systems
The method of Zhang et al. is a novel modification of Goldstein's classical subgradient method. Their work, however, relies on a nonstandard subgradient oracle model and requires the function to be directionally differentiable. Our first contribution in this paper is to show that both of these assumptions can be dropped simply by adding a small random perturbation in each step of their algorithm. The resulting method works on any Lipschitz function whose value and gradient can be evaluated at points of differentiability.
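To make the perturbation idea concrete, the sketch below shows a single perturbed gradient step in Python. It is only an illustrative sketch, not the paper's algorithm: the function names, step size, and perturbation radius are placeholder assumptions. The point it illustrates is that the gradient oracle is queried only at a randomly perturbed point, which for a Lipschitz function is a point of differentiability with probability one (by Rademacher's theorem).

```python
import numpy as np

def perturbed_gradient_step(grad_f, x, step_size=1e-2, perturb_radius=1e-6, rng=None):
    """One illustrative step: query the gradient at a small random
    perturbation of x rather than at x itself. For a Lipschitz function,
    the perturbed point is a point of differentiability with probability
    one, so the standard gradient is well defined there.
    Step size and radius are placeholder values, not the paper's choices."""
    rng = np.random.default_rng() if rng is None else rng
    direction = rng.normal(size=x.shape)
    direction /= np.linalg.norm(direction)               # random unit direction
    y = x + perturb_radius * rng.uniform() * direction   # random point near x
    return x - step_size * grad_f(y)                     # gradient exists at y almost surely

# Usage on a nonsmooth Lipschitz function, f(x) = |x_1| + |x_2|.
grad_f = lambda x: np.sign(x)   # valid wherever no coordinate of x is zero
x = np.array([1.0, -0.5])
for _ in range(200):
    x = perturbed_gradient_step(grad_f, x)
```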