Adaptive Accelerated Gradient Converging Method under H\"{o}lderian Error Bound Condition
Neural Information Processing Systems
Recent studies have shown that the proximal gradient (PG) method and the accelerated proximal gradient (APG) method with restarting can enjoy linear convergence under a condition weaker than strong convexity, namely the quadratic growth condition (QGC). However, the faster convergence of the restarting APG method relies on a potentially unknown constant in the QGC to restart APG at the appropriate times, which restricts its applicability.
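To make the restarting mechanism concrete, the following is a minimal sketch of a FISTA-style accelerated proximal gradient loop with fixed-period restarts; the function names, the toy least-squares problem, and the choice of restart period are illustrative assumptions, not the paper's algorithm. In theory the best restart period depends on the (often unknown) QGC constant, which is exactly the practical difficulty the abstract points to.

```python
import numpy as np

def restarted_apg(grad, prox, x0, step, restart_period, n_iters):
    """FISTA-style accelerated proximal gradient with fixed-period restarts.

    Every `restart_period` iterations the momentum is reset; the ideal
    period depends on the (potentially unknown) QGC constant.
    """
    x = y = x0.copy()
    t = 1.0
    for k in range(1, n_iters + 1):
        x_new = prox(y - step * grad(y), step)          # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
        if k % restart_period == 0:                     # restart: drop momentum
            y, t = x.copy(), 1.0
    return x

# Toy problem: least squares min 0.5*||Ax - b||^2, no regularizer,
# so the proximal operator is the identity.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)
prox = lambda v, s: v
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
x_hat = restarted_apg(grad, prox, np.zeros(5), 1.0 / L,
                      restart_period=50, n_iters=500)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x_hat, x_star, atol=1e-5))
```

Running this on the toy problem recovers the least-squares solution; the point of the sketch is that `restart_period` is a tuning knob, and choosing it well in general requires knowledge of the growth constant.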