Generalized Nonnegative Structured Kruskal Tensor Regression

Xinjue Wang, Esa Ollila, Sergiy A. Vorobyov, Ammar Mian

arXiv.org Machine Learning 

Tensor decompositions have emerged as powerful analytical tools across diverse fields including signal/image processing [1, 2, 3, 4, 5, 6], chemometrics [7], geophysics [8, 9], and neuroscience [10, 11, 12, 13]. Their effectiveness stems from their ability to approximate high-dimensional tensors with low-rank decompositions, offering efficient dimensionality reduction while preserving essential structural information. For example, tensor decomposition techniques [14, 15, 16] are applied in hyperspectral image (HSI) analysis to extract low-rank structures for dimensionality reduction [9], and used in electroencephalogram (EEG) analysis to capture latent patterns across multiple dimensions [10]. Over the past decade, tensor regression (TR) models have received attention, with numerous approaches proposed in the literature, including Tucker tensor regression [17], low-rank orthogonally decomposable tensor regression [18], Bayesian Kruskal tensor regression (KTR) [19], Bayesian low rank tensor ring completion [20], graph-regularized tensor regression [21], and tensor regression network [22].
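To make the low-rank idea concrete, the following is a minimal numpy sketch (not the paper's method) of the Kruskal (CP) form underlying KTR: a rank-R third-order tensor is built as a sum of R rank-1 outer products of factor-matrix columns, so storing the factors requires far fewer parameters than the full tensor. All dimensions and variable names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Kruskal (CP) form: a rank-R tensor is a sum of R rank-1 outer products.
# Factor matrices A (I x R), B (J x R), C (K x R) define the tensor.
I, J, K, R = 8, 9, 10, 2
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Reconstruct the full I x J x K tensor from the factors.
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalent view: explicit sum of R rank-1 terms, one per column r.
X_sum = sum(np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r])
            for r in range(R))
assert np.allclose(X, X_sum)

# Dimensionality reduction: R*(I+J+K) factor parameters vs I*J*K entries.
print((I + J + K) * R, I * J * K)  # 54 vs 720
```

The parameter count `R*(I+J+K)` growing linearly (rather than multiplicatively) in the mode sizes is what makes such decompositions attractive for regression with tensor-valued covariates.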
