First-order methods for stochastic and finite-sum convex optimization with deterministic constraints

Lu, Zhaosong, Xiao, Yifeng

arXiv.org Artificial Intelligence 

In this paper, we study a class of stochastic and finite-sum convex optimization problems with deterministic constraints. Existing methods typically aim to find an $ε$-$expectedly\ feasible\ stochastic\ optimal$ solution, in which the expected constraint violation and expected optimality gap are both within a prescribed tolerance $ε$. However, in many practical applications, constraints must be nearly satisfied with certainty, rendering such solutions potentially unsuitable due to the risk of substantial violations. To address this issue, we propose stochastic first-order methods for finding an $ε$-$surely\ feasible\ stochastic\ optimal$ ($ε$-SFSO) solution, in which the constraint violation is deterministically bounded by $ε$ and the expected optimality gap is at most $ε$. Our methods apply an accelerated stochastic gradient (ASG) scheme or a modified variance-reduced ASG scheme $only\ once$ to a sequence of quadratic penalty subproblems with appropriately chosen penalty parameters. We establish first-order oracle complexity bounds for the proposed methods in computing an $ε$-SFSO solution. As a byproduct, we also derive first-order oracle complexity results for the sample average approximation (SAA) method in computing an $ε$-SFSO solution of the stochastic optimization problem, using our proposed methods to solve the sample average problem.
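To make the penalty-continuation idea concrete, the following is a minimal deterministic sketch, not the authors' stochastic method: it solves a toy linearly constrained quadratic program by applying Nesterov-accelerated gradient descent to a sequence of quadratic penalty subproblems with an increasing penalty parameter, warm-starting each subproblem from the previous solution. The problem data, the penalty schedule, and the helper name `agd` are all illustrative assumptions; the paper's actual methods use stochastic (variance-reduced) gradients and a specific penalty-parameter choice to guarantee the deterministic $ε$ bound on constraint violation.

```python
import numpy as np

def agd(grad, x0, L, mu, iters):
    """Nesterov's accelerated gradient method for a mu-strongly convex
    function with L-Lipschitz gradient (constant-momentum variant)."""
    q = mu / L
    beta = (1.0 - np.sqrt(q)) / (1.0 + np.sqrt(q))
    x = x0.copy()
    y = x0.copy()
    for _ in range(iters):
        x_new = y - grad(y) / L
        y = x_new + beta * (x_new - x)
        x = x_new
    return x

# Toy problem (illustrative): min ||x - c||^2  s.t.  a.x <= b.
# The exact solution is the projection of c onto the halfspace, x* = (0.5, 0.5).
c = np.array([2.0, 2.0])
a = np.array([1.0, 1.0])
b = 1.0

x = np.zeros(2)
# Increasing penalty parameters; each subproblem is warm-started from the last.
for rho in [1e1, 1e2, 1e3, 1e4]:
    # Quadratic penalty subproblem: min ||x - c||^2 + (rho/2) * max(0, a.x - b)^2
    grad = lambda y, rho=rho: 2.0 * (y - c) + rho * max(0.0, a @ y - b) * a
    L = 2.0 + rho * (a @ a)   # gradient Lipschitz constant of the penalized objective
    mu = 2.0                  # strong convexity constant of ||x - c||^2
    x = agd(grad, x, L, mu, iters=2000)

# The final iterate is near-optimal and its constraint violation shrinks
# like O(1/rho), mimicking how a growing penalty drives the violation below eps.
print(x, max(0.0, a @ x - b))
```

For this toy instance the penalty subproblem minimizer satisfies $a^\top x - b = 3/(1+\rho)$, so the final violation is about $3\times 10^{-4}$; in the paper's setting the penalty parameters are chosen so that the violation is deterministically at most $ε$.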
