Reviews: Low-Rank Regression with Tensor Responses
Strength:
--The paper provides a theoretical analysis with approximation guarantees and a generalization bound for the class of tensor-valued regression functions.
Weakness:
--A major drawback is that the novelty and contribution are rather limited. The key idea and the model of this paper are essentially equivalent to HOPLS in the following paper [Zhao et al.]. HOPLS assumes that both the tensor input and the tensor output have low-rank structure, links the two through a common latent space, and then performs a regression step against the projected latent variables.
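The shared-latent-space idea the review describes can be sketched in the matrix case: take the loading that maximizes covariance between input and output, project onto it, and regress against the latent score. This is a minimal illustrative sketch, not HOPLS itself (which operates on tensors and extracts loadings mode by mode); all data shapes and variable names here are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.standard_normal((n, 8))                              # predictors
Y = X @ rng.standard_normal((8, 4)) \
    + 0.01 * rng.standard_normal((n, 4))                     # responses

# The top left singular vector of the cross-covariance X^T Y gives the
# loading w whose latent score t = X w has maximal covariance with Y.
w, _, _ = np.linalg.svd(X.T @ Y)
t = X @ w[:, :1]                                             # (n, 1) latent variable

# Regression step against the projected latent variable.
coef = np.linalg.lstsq(t, Y, rcond=None)[0]
Y_hat = t @ coef                                             # rank-1 prediction of Y
```

Deflating the residual `Y - Y_hat` and repeating yields further components, which is how the number of loadings controls model complexity in this family of methods.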
Higher-Order Partial Least Squares (HOPLS): A Generalized Multi-Linear Regression Method
Zhao, Qibin, Caiafa, Cesar F., Mandic, Danilo P., Chao, Zenas C., Nagasaka, Yasuo, Fujii, Naotaka, Zhang, Liqing, Cichocki, Andrzej
A new generalized multilinear regression model, termed the Higher-Order Partial Least Squares (HOPLS), is introduced with the aim to predict a tensor (multiway array) $\mathcal{Y}$ from a tensor $\mathcal{X}$ through projecting the data onto the latent space and performing regression on the corresponding latent variables. HOPLS differs substantially from other regression models in that it explains the data by a sum of orthogonal Tucker tensors, while the number of orthogonal loadings serves as a parameter to control model complexity and prevent overfitting. The low dimensional latent space is optimized sequentially via a deflation operation, yielding the best joint subspace approximation for both $\mathcal{X}$ and $\mathcal{Y}$. Instead of decomposing $\mathcal{X}$ and $\mathcal{Y}$ individually, higher order singular value decomposition on a newly defined generalized cross-covariance tensor is employed to optimize the orthogonal loadings. A systematic comparison on both synthetic data and real-world decoding of 3D movement trajectories from electrocorticogram (ECoG) signals demonstrates the advantages of HOPLS over the existing methods in terms of better predictive ability, suitability to handle small sample sizes, and robustness to noise.
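The HOSVD-on-cross-covariance step described in the abstract can be sketched in NumPy: contract the shared sample mode of the two tensors to form the cross-covariance tensor, then take the left singular vectors of each mode unfolding as the orthogonal loadings. This is a minimal sketch under assumed shapes, not the full HOPLS algorithm (which truncates the loadings and iterates with deflation).

```python
import numpy as np

rng = np.random.default_rng(1)
n, I1, I2, J1 = 40, 6, 5, 4
X = rng.standard_normal((n, I1, I2))    # tensor predictors, sample mode first
Y = rng.standard_normal((n, J1))        # responses

# Cross-covariance tensor: contract away the shared sample mode.
C = np.tensordot(X, Y, axes=([0], [0]))          # shape (I1, I2, J1)

def mode_unfold(T, mode):
    """Mode-k unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# HOSVD: the orthogonal loadings for each mode are the left singular
# vectors of the corresponding unfolding of C.
loadings = [np.linalg.svd(mode_unfold(C, k), full_matrices=False)[0]
            for k in range(C.ndim)]
```

In HOPLS proper, each loading matrix would be truncated to a small number of columns, and that truncation rank is the complexity-controlling parameter the abstract refers to.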
Multilinear Subspace Regression: An Orthogonal Tensor Decomposition Approach
Zhao, Qibin, Caiafa, Cesar F., Mandic, Danilo P., Zhang, Liqing, Ball, Tonio, Schulze-bonhage, Andreas, Cichocki, Andrzej S.
A multilinear subspace regression model based on so called latent variable decomposition is introduced. Unlike standard regression methods which typically employ matrix (2D) data representations followed by vector subspace transformations, the proposed approach uses tensor subspace transformations to model common latent variables across both the independent and dependent data. The proposed approach aims to maximize the correlation between the so derived latent variables and is shown to be suitable for the prediction of multidimensional dependent data from multidimensional independent data, where for the estimation of the latent variables we introduce an algorithm based on Multilinear Singular Value Decomposition (MSVD) on a specially defined cross-covariance tensor. It is next shown that in this way we are also able to unify the existing Partial Least Squares (PLS) and N-way PLS regression algorithms within the same framework. Simulations on benchmark synthetic data confirm the advantages of the proposed approach, in terms of its predictive ability and robustness, especially for small sample sizes. The potential of the proposed technique is further illustrated on a real world task of the decoding of human intracranial electrocorticogram (ECoG) from a simultaneously recorded scalp electroencephalograph (EEG).