A Multi-resolution Low-rank Tensor Decomposition

Rozada, Sergio, Marques, Antonio G.

arXiv.org Artificial Intelligence 

The (efficient and parsimonious) decomposition of higher-order tensors is a fundamental problem with numerous applications in a variety of fields. Several methods have been proposed in the literature to that end, with the Tucker and PARAFAC decompositions being the most prominent ones. Inspired by the latter, in this work we propose a multi-resolution low-rank tensor decomposition to describe (approximate) a tensor in a hierarchical fashion. The central idea of the decomposition is to recast the tensor into multiple lower-dimensional tensors to exploit the structure at different levels of resolution.

The PARAFAC decomposition is conceptually simple and its representation complexity scales gracefully (the number of parameters grows linearly with the rank). The Tucker decomposition enjoys additional degrees of freedom at the cost of greater complexity (exponential dependence of the number of parameters with respect to the rank). Hierarchical tensor decompositions, such as the Tensor Train (TT) decomposition [8] or a hierarchical Tucker (hTucker) decomposition [9], try to alleviate this problem. The former unwraps the tensor into a chain of three-dimensional tensors, and the latter […]
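As a concrete illustration of the complexity claims above, the following NumPy sketch builds a third-order tensor from a rank-R PARAFAC (CP) model and compares its parameter count with a Tucker model of the same multilinear rank. The dimensions, variable names, and the `cp_reconstruct` helper are illustrative choices, not part of the paper.

```python
import numpy as np

def cp_reconstruct(A, B, C):
    """Rank-R PARAFAC/CP model: X[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Hypothetical tensor dimensions and rank for illustration.
I, J, K, R = 10, 12, 14, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
X = cp_reconstruct(A, B, C)  # shape (I, J, K)

# PARAFAC: one factor matrix per mode -> parameters grow linearly with R.
cp_params = R * (I + J + K)
# Tucker: factor matrices plus a dense R x R x R core -> the core term
# grows exponentially with the tensor order (here, cubically for order 3).
tucker_params = R**3 + R * (I + J + K)
```

For small ranks the difference is modest, but the `R**3` core term is what the hierarchical decompositions (TT, hTucker) mentioned above are designed to avoid.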
