A regret minimization approach to fixed-point iterations
arXiv.org Artificial Intelligence
We propose a conversion scheme that turns regret-minimizing algorithms into fixed-point iterations, with convergence guarantees following from regret bounds. The resulting iterations can be seen as a grand extension of the classical Krasnoselskii--Mann iterations, as the latter are recovered by converting the Online Gradient Descent algorithm. This approach yields new, simple iterations for finding fixed points of non-self operators. We also focus on converting algorithms from the AdaGrad family of regret minimizers, and thus obtain fixed-point iterations with adaptive guarantees of a new kind. Numerical experiments on various problems demonstrate faster convergence of AdaGrad-based fixed-point iterations over Krasnoselskii--Mann iterations.
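For reference, the classical Krasnoselskii--Mann iteration that the abstract mentions averages the current iterate with the operator's output, x_{k+1} = (1 - alpha) x_k + alpha T(x_k). The sketch below is an illustrative implementation of that baseline iteration only (not the paper's regret-based conversion); the operator T, the step size alpha, and the tolerance are example choices.

```python
import math


def km_iterate(T, x0, alpha=0.5, tol=1e-10, max_iter=10_000):
    """Krasnoselskii--Mann averaging: x <- (1 - alpha) x + alpha T(x)."""
    x = x0
    for _ in range(max_iter):
        tx = T(x)
        if abs(tx - x) < tol:  # stop once x is numerically fixed under T
            break
        x = (1 - alpha) * x + alpha * tx
    return x


# Example: T(x) = cos(x) is nonexpansive on [0, 1]; its unique fixed
# point is the Dottie number, approximately 0.739085.
fp = km_iterate(math.cos, x0=0.0)
print(round(fp, 6))
```

With a nonexpansive operator and alpha in (0, 1), the averaged iterates converge to a fixed point whenever one exists; this is the baseline against which the AdaGrad-based iterations are compared in the experiments.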
Sep-29-2025