Collaborating Authors: woodruff



Hardness of Low Rank Approximation of Entrywise Transformed Matrix Products

Neural Information Processing Systems

Some related lower bounds include the work of Backurs et al. [2017], which shows hardness of solving kernel Support Vector Machine (SVM), ridge regression, or Principal Component Analysis (PCA) problems to high accuracy, or of approximating kernel density estimates up to a constant factor, for kernels with






Total Least Squares Regression in Input Sparsity Time

Huaian Diao, Zhao Song, David Woodruff, Xin Yang

Neural Information Processing Systems

In the total least squares problem, one is given an m × n matrix A and an m × d matrix B, and one seeks to "correct" both A and B, obtaining matrices Â and B̂, so that there exists an X satisfying the equation ÂX = B̂. Typically the problem is overconstrained, meaning that m ≫ max(n, d).
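As a quick illustration of the problem setup (a sketch of the classical SVD-based solution, not the input-sparsity-time algorithm of the paper), total least squares can be solved via the right singular vectors of the stacked matrix [A B]: if V22, the bottom-right d × d block of V, is invertible, then X = -V12 V22⁻¹.

```python
import numpy as np

def total_least_squares(A, B):
    """Classical SVD-based total least squares.

    Given A (m x n) and B (m x d), returns X minimizing the Frobenius
    norm of the correction [A - Ahat, B - Bhat] subject to Ahat X = Bhat.
    """
    m, n = A.shape
    d = B.shape[1]
    # Stack A and B side by side and take the full SVD.
    C = np.hstack([A, B])
    _, _, Vt = np.linalg.svd(C)
    V = Vt.T
    # Partition V: the last d columns span the (approximate) null space.
    V12 = V[:n, n:]   # n x d block
    V22 = V[n:, n:]   # d x d block; assumed invertible here
    return -V12 @ np.linalg.inv(V22)

# Sanity check on noiseless data: TLS recovers the exact X.
rng = np.random.default_rng(0)
m, n, d = 50, 4, 2
A = rng.standard_normal((m, n))
X_true = rng.standard_normal((n, d))
B = A @ X_true
X_tls = total_least_squares(A, B)
```

When the data are exactly consistent (B = AX), the last d singular values of [A B] are zero and the formula recovers X exactly; with noise, it yields the minimum-Frobenius-norm correction.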