Reviews: Tight Dimensionality Reduction for Sketching Low Degree Polynomial Kernels
–Neural Information Processing Systems
This work establishes an improved bound on the sample complexity of random tensor projection and argues that this bound is tight and nearly optimal. A key observation is to view the random sketch as a bilinear form in a random matrix, which makes the analysis amenable to general matrix concentration inequalities. The authors obtain better bounds by controlling both the operator norm and the Frobenius norm of this random matrix, which is the main technical challenge of the work. The proof techniques differ from previous approaches and are impressive.
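To illustrate the kind of object under review (not the paper's specific construction), here is a minimal sketch of the degree-2 polynomial kernel via a Gaussian random projection of the tensor product x ⊗ x; the sketch of x is linear in xx^T, so the inner product of two sketches is a bilinear form in the random matrix S. All dimensions and the choice of a dense Gaussian S are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 30, 2000  # input dimension, sketch dimension (assumed values)

# Two unit vectors whose degree-2 kernel <x, y>^2 we want to approximate.
x = rng.standard_normal(d); x /= np.linalg.norm(x)
y = rng.standard_normal(d); y /= np.linalg.norm(y)

# Random sketch matrix applied to vec(x x^T); scaling 1/sqrt(m) makes the
# sketched inner product an unbiased estimate of <x, y>^2.
S = rng.standard_normal((m, d * d)) / np.sqrt(m)
sx = S @ np.outer(x, x).ravel()  # sketch of x (x) x
sy = S @ np.outer(y, y).ravel()  # sketch of y (x) y

exact = np.dot(x, y) ** 2   # degree-2 polynomial kernel value
approx = np.dot(sx, sy)     # inner product of the sketches
print(exact, approx)        # approx concentrates around exact as m grows
```

A dense Gaussian projection of the d^2-dimensional tensor product is statistically simple but computationally wasteful; the point of fast tensor-sketching constructions is to achieve comparable concentration with far less computation, and the review above concerns how small m can be made while preserving accuracy.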
matrix concentration inequality, sketching low degree polynomial kernel, tight dimensionality reduction, (3 more...)
Jan-26-2025, 17:04:19 GMT