Uncertainty quantification for nonconvex tensor completion: Confidence intervals, heteroscedasticity and optimality
Changxiao Cai, H. Vincent Poor, Yuxin Chen
In many practical scenarios, however, we do not have full access to the large-dimensional tensor of interest, as only a sample of its entries is revealed to us; yet we still wish to reliably infer all of the missing data. This task, commonly referred to as tensor completion, finds applications in numerous domains, including medical imaging [SHKM14], visual data analysis [LMWY13], and seismic data reconstruction [KSS13], to name just a few. In order to make meaningful inferences about the unseen entries, additional information about the unknown tensor plays a pivotal role (otherwise one is left with fewer equations than unknowns). A common type of such prior information is low-rank structure, which hypothesizes that the unknown tensor is decomposable into a superposition of a few rank-one tensors. Substantial attempts have been made in the past few years to understand and tackle such low-rank tensor completion problems. To set the stage for a formal discussion, we formulate the problem as follows.
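As a point of reference (and not the paper's own formal setup, which is stated in the formulation that follows), the low-rank hypothesis above is commonly expressed through a rank-r CP decomposition; the display below is a generic sketch, with the symbols T, r, u_i, v_i, w_i, d_1, d_2, d_3, and \Omega chosen here purely for illustration.
\[
  T \;=\; \sum_{i=1}^{r} u_i \otimes v_i \otimes w_i,
  \qquad u_i \in \mathbb{R}^{d_1},\ v_i \in \mathbb{R}^{d_2},\ w_i \in \mathbb{R}^{d_3},
\]
where only the entries \( \{ T_{j,k,l} : (j,k,l) \in \Omega \} \) are observed for some sampling set \( \Omega \) with \( |\Omega| \ll d_1 d_2 d_3 \). Under this kind of model, tensor completion amounts to inferring the unobserved entries by exploiting the fact that the rank r is small relative to the ambient dimensions.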
Jun-15-2020