Tensor train completion: local recovery guarantees via Riemannian optimization
Budzinskiy, Stanislav, Zamarashkin, Nikolai
arXiv.org Artificial Intelligence
The problem of recovering algebraically structured data from scarce measurements has become a classic one. The data under consideration are typically sparse vectors or low-rank matrices and tensors, while the measurements are obtained by applying a linear operator that satisfies a variant of the so-called restricted isometry property (RIP) [1]. In this work, we focus on tensor completion, which consists in recovering a tensor in the tensor train (TT) format [2, 3] from a small subset of its entries. Specifically, we cast it as a Riemannian optimization problem [4, 5] on the smooth manifold of tensors with fixed TT ranks and derive sufficient conditions (essentially, the RIP) for local convergence of Riemannian gradient descent. We further estimate the number of randomly selected entries of a tensor with low TT ranks that suffices for the RIP to hold with high probability and, as a consequence, for Riemannian gradient descent to converge locally.
Aug-30-2023
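
As a rough illustration of the completion problem described in the abstract, the toy Python snippet below fills in the missing entries of a synthetic low-TT-rank tensor by alternating a Euclidean gradient step on the observed entries with a retraction back to fixed TT ranks via TT-SVD. It is a simplified sketch, not the authors' algorithm: in particular, the tangent-space projection that makes the method genuinely Riemannian is omitted, and the function names, tensor sizes, TT ranks, step size, and sampling rate are all illustrative assumptions.

```python
import numpy as np


def tt_svd(tensor, ranks):
    """Truncate a full tensor to the prescribed TT ranks via sequential SVDs (TT-SVD)."""
    dims = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(r_prev * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(ranks[k], len(s))
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        mat = (s[:r, None] * vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores


def tt_to_full(cores):
    """Contract TT cores back into a full tensor."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=(-1, 0))
    return full.squeeze(axis=(0, -1))  # drop the boundary ranks of size 1


def tt_complete(observed, mask, ranks, step=1.0, iters=300):
    """Retracted gradient descent: gradient step on the sampled entries, then TT-SVD retraction."""
    x = tt_to_full(tt_svd(observed, ranks))  # start from the rounded zero-filled data
    for _ in range(iters):
        grad = mask * (x - observed)         # Euclidean gradient of the sampled least-squares loss
        x = tt_to_full(tt_svd(x - step * grad, ranks))
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dims, ranks = (8, 8, 8, 8), (3, 3, 3)                          # illustrative sizes and TT ranks
    target = tt_to_full(tt_svd(rng.standard_normal(dims), ranks))  # exact low-TT-rank test tensor
    mask = rng.random(dims) < 0.4                                  # observe roughly 40% of the entries
    estimate = tt_complete(mask * target, mask, ranks)
    print("relative error:", np.linalg.norm(estimate - target) / np.linalg.norm(target))
```

With unit step size each iteration simply re-imputes the observed entries and rounds the result back to the fixed TT ranks; smaller steps or more iterations may be needed when the sampling is sparser or the ranks are misspecified.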