Reviews: Sublinear Time Orthogonal Tensor Decomposition
–Neural Information Processing Systems
This paper presents a randomized method for decomposing a symmetric, orthogonal tensor that is "nearly low rank". The restriction to symmetric tensors with orthogonal components may sound limiting, but this is exactly the kind of tensor decomposition problem that arises in spectral methods for Latent Dirichlet Allocation, so it is an interesting model to study. The method very closely follows a paper from NIPS 2015, "Fast and guaranteed tensor decomposition via sketching", which uses a standard tensor power method for the decomposition but speeds up each iteration by accelerating the required tensor contractions (evaluations of the tensor against vectors) via randomized sketching techniques. This paper uses the same algorithm, but instead of sketches based on "random projections", which take random linear combinations of all of the entries of the tensor, it uses a subsampling technique to approximate the contractions. The authors' main selling point is that this approach should allow for better running times, since each iteration no longer has to touch every entry of the tensor.
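To make the contrast concrete, here is a minimal sketch (not the authors' implementation; the problem sizes, sampling scheme, and estimator are illustrative assumptions) of a tensor power iteration where the contraction T(I, v, v) is estimated from a uniform subsample of index pairs rather than computed from every entry of T:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small symmetric orthogonal rank-2 tensor T = sum_r lam_r * q_r (x) q_r (x) q_r
# (a hypothetical toy instance, not the paper's experimental setup).
n, k = 20, 2
lam = np.array([3.0, 1.5])
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal columns q_r
T = np.einsum('r,ir,jr,kr->ijk', lam, Q, Q, Q)

def contract_exact(T, v):
    # Exact contraction T(I, v, v)_i = sum_{j,k} T[i,j,k] v_j v_k;
    # touches all n^3 entries of T.
    return np.einsum('ijk,j,k->i', T, v, v)

def contract_sampled(T, v, m, rng):
    # Unbiased estimate of T(I, v, v) from m uniformly sampled index
    # pairs (j, k), rescaled by n^2 / m; touches only n * m entries.
    n = T.shape[0]
    js = rng.integers(0, n, size=m)
    ks = rng.integers(0, n, size=m)
    return (n * n / m) * (T[:, js, ks] * (v[js] * v[ks])).sum(axis=1)

# Tensor power iteration driven by the sampled contraction.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
for _ in range(30):
    v = contract_sampled(T, v, m=2000, rng=rng)
    v /= np.linalg.norm(v)

# v should align (up to sign) with the top component Q[:, 0].
print(abs(v @ Q[:, 0]))
```

In the exact power method, each iterate is T(I, v, v) renormalized, and the fixed points are the components q_r; the subsampled version replaces that contraction with a noisy but unbiased estimate, trading accuracy per iteration for per-iteration cost.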