Black-Box Approximation and Optimization with Hierarchical Tucker Decomposition

Gleb Ryzhakov, Andrei Chertkov, Artem Basharin, Ivan Oseledets

arXiv.org Artificial Intelligence 

We develop a new method, HTBB, for multidimensional black-box approximation and gradient-free optimization, which is based on the low-rank hierarchical Tucker decomposition with the use of the MaxVol indices selection procedure. Numerical experiments for 14 complex model problems demonstrate the robustness of the proposed method for dimensions up to 1000, while it shows significantly more accurate results than classical gradient-free optimization methods, as well as approximation and optimization methods based on the popular tensor train decomposition, which represents a simpler case of a tensor network.

Storing such a tensor often requires too much computational effort, and for large values of the dimension d, this is completely impossible due to the so-called curse of dimensionality (the memory for storing data and the complexity of working with it grow exponentially in d). To overcome it, various compression formats for multidimensional tensors have been proposed: Canonical Polyadic decomposition, aka CANDECOMP/PARAFAC (CPD) (Harshman et al., 1970), Tucker decomposition (Tucker, 1966), Tensor Train (TT) decomposition (Oseledets, 2011), Hierarchical Tucker (HT) decomposition (Hackbusch & Kühn, 2009; Ballani et al., 2013), and their various modifications. These approaches make it possible to approximately represent the tensor in a compact low-rank (i.e., low-parameter) format and then
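To make the storage savings of such low-rank formats concrete, the following sketch (an illustration, not the paper's HTBB method; the dimensions, ranks, and factor values are arbitrary) builds a rank-1 CP representation of a d-dimensional tensor: d factor vectors of length n replace the n^d entries of the full tensor, which here is reconstructed only to verify the count.

```python
import numpy as np

# Illustrative example: a rank-1 CP (canonical polyadic) format stores
# d factor vectors (d * n numbers) instead of all n**d tensor entries.
d, n = 6, 4                                   # 6 dimensions, 4 points per mode
factors = [np.arange(1.0, n + 1) + k for k in range(d)]  # arbitrary factors

# Reconstruct the full tensor from the compact format via outer products;
# entry (i_1, ..., i_d) equals the product of factors[k][i_k] over k.
full = factors[0]
for f in factors[1:]:
    full = np.multiply.outer(full, f)

print(full.shape)                             # (4, 4, 4, 4, 4, 4)
print(full.size, "entries vs", d * n, "stored parameters")  # 4096 vs 24
```

For a fixed rank, the gap between the exponential cost n^d of the full tensor and the linear cost of the factors is exactly the "curse of dimensionality" that CPD, Tucker, TT, and HT formats are designed to avoid; the higher-rank and hierarchical formats trade more parameters for a much richer class of representable tensors.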