Fast Rates in Stochastic Online Convex Optimization by Exploiting the Curvature of Feasible Sets
–Neural Information Processing Systems
In this work, we explore online convex optimization (OCO) and introduce a new condition and analysis that provides fast rates by exploiting the curvature of feasible sets. In online linear optimization, it is known that if the average gradient of loss functions exceeds a certain threshold, the curvature of feasible sets can be exploited by the follow-the-leader (FTL) algorithm to achieve a logarithmic regret. This study reveals that algorithms adaptive to the curvature of loss functions can also leverage the curvature of feasible sets. In particular, we first prove that if an optimal decision is on the boundary of a feasible set and the gradient of an underlying loss function is non-zero, then such an algorithm achieves a regret bound of O(\rho \log T) in stochastic environments. Here, \rho > 0 is the radius of the smallest sphere that includes the optimal decision and encloses the feasible set.
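To illustrate the phenomenon the abstract refers to, the following is a minimal sketch (not code from the paper) of FTL for online linear optimization over a strongly curved feasible set, the Euclidean unit ball. With linear losses f_t(x) = <g_t, x>, the FTL decision has the closed form x_t = -G_{t-1}/||G_{t-1}|| for the cumulative gradient G_{t-1}. The dimension, noise level, and loss distribution below are illustrative assumptions; when the mean gradient is bounded away from zero, the cumulative (pseudo-)regret against the boundary optimum grows only logarithmically in T.

```python
import numpy as np

# Illustrative sketch: FTL on the unit ball with stochastic linear losses.
# All constants (d, T, noise scale) are assumptions chosen for demonstration.
rng = np.random.default_rng(0)
d, T = 5, 10_000
mu = rng.normal(size=d)
mu /= np.linalg.norm(mu)          # mean gradient with norm 1 (bounded away from zero)

x_star = -mu                      # optimal decision lies on the boundary of the ball
G = np.zeros(d)                   # cumulative gradient
regret = 0.0

for t in range(1, T + 1):
    # FTL plays the minimizer of sum_{s<t} <g_s, x> over the unit ball.
    norm_G = np.linalg.norm(G)
    x_t = -G / norm_G if norm_G > 0 else np.zeros(d)
    g_t = mu + 0.1 * rng.normal(size=d)   # stochastic linear loss gradient
    regret += g_t @ (x_t - x_star)        # instantaneous regret vs. the optimum
    G += g_t

print(f"cumulative regret after T={T}: {regret:.3f}")   # grows like O(log T)
```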