Nonasymptotic CLT and Error Bounds for Two-Time-Scale Stochastic Approximation
Kong, Seo Taek, Zeng, Sihan, Doan, Thinh T., Srikant, R.
–arXiv.org Artificial Intelligence
We consider linear two-time-scale stochastic approximation algorithms driven by martingale noise. Recent applications in machine learning motivate the need to understand finite-time error rates, but conventional stochastic approximation analyses focus on either asymptotic convergence in distribution or finite-time bounds that are far from optimal. Prior work on asymptotic central limit theorems (CLTs) suggests that two-time-scale algorithms may be able to achieve $1/\sqrt{n}$ error in expectation, with a constant given by the expected norm of the limiting Gaussian vector. However, the best known finite-time rates are much slower. We derive the first non-asymptotic central limit theorem with respect to the Wasserstein-1 distance for two-time-scale stochastic approximation with Polyak-Ruppert averaging. As a corollary, we show that the expected error achieved by Polyak-Ruppert averaging decays at rate $1/\sqrt{n}$, which significantly improves on the rates of convergence in prior works.
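To make the setting concrete, the following is a minimal sketch of linear two-time-scale stochastic approximation with Polyak-Ruppert averaging on a hypothetical 1-dimensional coupled linear system. The specific matrix entries, step-size exponents, and noise scale are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coupled linear system (illustrative values):
#   a11*x + a12*y = b1   (fast iterate x)
#   a21*x + a22*y = b2   (slow iterate y)
a11, a12, b1 = 2.0, 1.0, 3.0
a21, a22, b2 = 1.0, 3.0, 4.0

# True solution, for reference only
x_star, y_star = np.linalg.solve([[a11, a12], [a21, a22]], [b1, b2])

n_steps = 200_000
x, y = 0.0, 0.0
x_sum = 0.0

for n in range(1, n_steps + 1):
    # Two time scales: the fast step size alpha_n dominates the slow
    # step size beta_n as n grows (exponents are assumptions).
    alpha = 1.0 / n**0.6   # fast time scale
    beta = 1.0 / n**0.9    # slow time scale
    # Martingale-difference (zero-mean, independent) noise
    noise_x = rng.normal(scale=0.1)
    noise_y = rng.normal(scale=0.1)
    x = x - alpha * (a11 * x + a12 * y - b1 + noise_x)
    y = y - beta * (a21 * x + a22 * y - b2 + noise_y)
    x_sum += x

# Polyak-Ruppert average of the fast iterate
x_bar = x_sum / n_steps
print(abs(x_bar - x_star))
```

The averaged iterate `x_bar` smooths out the martingale noise in the raw iterates; the paper's result concerns the rate at which the distribution of such averaged errors approaches its Gaussian limit.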
Feb-13-2025