Guaranteed Noisy CP Tensor Recovery via Riemannian Optimization on the Segre Manifold
Recovering a low-CP-rank tensor from noisy linear measurements is a central challenge in high-dimensional data analysis, with applications spanning tensor PCA, tensor regression, and beyond. We exploit the intrinsic geometry of rank-one tensors by casting the recovery task as an optimization problem over the Segre manifold, the smooth Riemannian manifold of rank-one tensors. This geometric viewpoint yields two powerful algorithms: Riemannian Gradient Descent (RGD) and Riemannian Gauss-Newton (RGN), each of which preserves feasibility at every iteration. Under mild noise assumptions, we prove that RGD converges at a local linear rate, while RGN exhibits an initial local quadratic convergence phase that transitions to a linear rate as the iterates approach the statistical noise floor. Extensive synthetic experiments validate these convergence guarantees and demonstrate the practical effectiveness of our methods.
October 2, 2025
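The abstract's approach of iterating over the Segre manifold (the set of rank-one tensors) can be illustrated with a minimal sketch. The code below is a hypothetical toy version, not the paper's actual algorithm: all dimensions, step sizes, noise levels, and helper names are illustrative, and for brevity it retracts the full Euclidean gradient step onto the manifold (via higher-order power iteration) rather than first projecting the gradient onto the tangent space as a full Riemannian gradient method would. Since the guarantees quoted in the abstract are local, the sketch warm-starts near the ground truth.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma = 8, 2000, 0.01  # illustrative sizes, not the paper's settings

# Ground-truth rank-one (Segre) tensor X* = a (x) b (x) c.
a, b, c = rng.standard_normal(n), rng.standard_normal(n), rng.standard_normal(n)
X_true = np.einsum("i,j,k->ijk", a, b, c)

# Noisy Gaussian linear measurements y = A(X*) + eps.
A = rng.standard_normal((m, n, n, n)) / np.sqrt(m)
y = np.einsum("mijk,ijk->m", A, X_true) + sigma * rng.standard_normal(m)

def retract_rank_one(X, iters=15):
    """Map an arbitrary tensor back onto the Segre manifold by computing a
    best rank-one approximation with a few higher-order power iterations."""
    v, w = rng.standard_normal(n), rng.standard_normal(n)
    for _ in range(iters):
        u = np.einsum("ijk,j,k->i", X, v, w); u /= np.linalg.norm(u)
        v = np.einsum("ijk,i,k->j", X, u, w); v /= np.linalg.norm(v)
        w = np.einsum("ijk,i,j->k", X, u, v)  # w carries the scale
    return np.einsum("i,j,k->ijk", u, v, w)

# Gradient step + retraction: every iterate stays rank one (feasible).
X = retract_rank_one(X_true + 0.1 * rng.standard_normal((n, n, n)))  # warm start
step = 0.4
for _ in range(100):
    resid = np.einsum("mijk,ijk->m", A, X) - y   # A(X) - y
    grad = np.einsum("m,mijk->ijk", resid, A)    # Euclidean gradient of 1/2 ||.||^2
    X = retract_rank_one(X - step * grad)        # retraction keeps feasibility

rel_err = np.linalg.norm(X - X_true) / np.linalg.norm(X_true)
print(f"relative recovery error: {rel_err:.4f}")
```

With well-conditioned Gaussian measurements and a warm start, the iterates contract toward the truth until the residual noise floor dominates, mirroring the local linear rate the abstract attributes to RGD.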