Appendices
A Basic Facts about Gaussian Distributions
B An Improved Analysis of NA-Hutch++
C Lower Bounds
C.1 Case 1: Lower Bound for Small ε
We let G denote an n × n random Gaussian matrix with i.i.d. N(0, 1) entries. Let R be an n × n orthogonal (rotation) matrix and let g be a standard Gaussian vector. Then Rg has the same distribution as g, since Rg is a mean-zero Gaussian vector with covariance RRᵀ = I.

Recall that NA-Hutch++ splits its matrix-vector queries between computing an O(1)-approximate rank-k approximation Ã and performing Hutchinson's estimate on the residual matrix A − Ã. The key to the improved query complexity of NA-Hutch++ is the analysis of the size of the random Gaussian sketching matrices S, R in Algorithm 1 needed to obtain an O(1)-approximate rank-k approximation Ã in the Frobenius norm. To get the desired rank-k approximation, we need S and R to satisfy two properties: 1) the subspace embedding property as in Lemma 3.3, and 2) the approximate matrix product property for orthogonal subspaces as in Lemma 3.4. Specifically, we show in Lemma 3.4 that choosing S and R to be of size O(k + log(1/δ)) suffices to obtain the second property with probability 1 − δ.
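To make the query-splitting structure concrete, the following is a minimal NumPy sketch of a non-adaptive Hutch++-style estimator in the spirit of Algorithm 1. The function name na_hutchpp, the split of the query budget into c1, c2, c3, the use of Gaussian (rather than sign) sketches, and the restriction to a symmetric A accessed through a matrix-vector oracle are illustrative assumptions, not the exact algorithm analyzed above.

```python
import numpy as np

def na_hutchpp(matvec, n, c1, c2, c3, rng=None):
    """Sketch of a non-adaptive Hutch++-style trace estimator for a symmetric
    matrix A, accessed only through matrix-vector products matvec(V) = A @ V.

    c1, c2: sizes of the Gaussian sketches S, R defining the low-rank
            approximation A~ = A R (Sᵀ A R)⁺ Sᵀ A.
    c3:     number of Hutchinson probes spent on the residual A - A~.
    """
    rng = np.random.default_rng(rng)
    S = rng.standard_normal((n, c1))   # left sketch
    R = rng.standard_normal((n, c2))   # right sketch
    G = rng.standard_normal((n, c3))   # Hutchinson probes

    Z = matvec(R)                      # A R
    W = matvec(S)                      # A S (= Aᵀ S since A is symmetric)

    # tr(A~) = tr((Sᵀ A R)⁺ (Sᵀ A)(A R)), using cyclicity of the trace.
    pinv_core = np.linalg.pinv(S.T @ Z)
    low_rank_trace = np.trace(pinv_core @ (W.T @ Z))

    # Hutchinson's estimate of tr(A - A~) using the remaining c3 queries.
    AG = matvec(G)                     # A G
    resid_trace = (np.trace(G.T @ AG)
                   - np.trace((G.T @ Z) @ pinv_core @ (W.T @ G))) / c3

    return low_rank_trace + resid_trace
```

In this sketch, c1 and c2 play the role of the sizes of S and R, so the claim above corresponds to taking c1, c2 = O(k + log(1/δ)) for the low-rank step and spending the remaining c3 queries on Hutchinson's estimate of the residual.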