Gradient Descent Meets Shift-and-Invert Preconditioning for Eigenvector Computation
Neural Information Processing Systems
Shift-and-invert preconditioning, a classic acceleration technique for leading-eigenvector computation, has recently received renewed attention owing to fast least-squares solvers that efficiently approximate the matrix inversions in power iterations. In this work, we adopt an inexact Riemannian gradient descent perspective to investigate the effect of the step-size scheme on this technique. The shift-and-inverted power method is included as a special case with adaptive step-sizes. In particular, two other step-size settings, namely constant step-sizes and Barzilai-Borwein (BB) step-sizes, are examined theoretically and/or empirically.
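To make the technique concrete, here is a minimal NumPy sketch of the shift-and-inverted power method the abstract refers to: power iteration applied to (σI − A)⁻¹, whose leading eigenvector coincides with that of A when the shift σ exceeds the top eigenvalue. All names and parameters below are illustrative assumptions; the paper's actual algorithm replaces the exact linear solve with an approximate least-squares solver.

```python
import numpy as np

def shift_invert_power(A, sigma, iters=50, seed=0):
    """Illustrative shift-and-inverted power method (not the paper's
    exact algorithm): iterate x <- (sigma*I - A)^{-1} x, normalizing
    each step. Assumes A is symmetric and sigma > lambda_max(A), so
    that B = sigma*I - A is positive definite."""
    n = A.shape[0]
    B = sigma * np.eye(n) - A
    x = np.random.default_rng(seed).normal(size=n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        # In practice this solve is only approximated by a fast
        # least-squares solver; here we solve it exactly for clarity.
        x = np.linalg.solve(B, x)
        x /= np.linalg.norm(x)
    return x

# Usage: estimate the leading eigenvector of a small symmetric matrix.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
v = shift_invert_power(A, sigma=6.0)
rayleigh = v @ A @ v  # Rayleigh quotient estimates the top eigenvalue
```

The shift concentrates the spectrum: the closer σ is to λ₁ from above, the larger the eigengap of (σI − A)⁻¹ and the faster the iteration converges, at the cost of a harder-to-solve linear system per step.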