Reviews: Stochastic Chebyshev Gradient Descent for Spectral Optimization
Spectral optimization is the problem of finding \theta that minimizes F(A(\theta)) + g(\theta), where A(\theta) is a symmetric matrix and F is typically the trace of an analytic function, i.e. F(A) = tr(f(A)) for some analytic f. The authors propose an unbiased estimator of F by randomly truncating the Chebyshev approximation to f and applying importance sampling over the truncation degree; moreover, they derive the optimal distribution for this importance sampling. They demonstrate how the estimator can be used within SGD and Stochastic Variance Reduced Gradient (SVRG).
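To make the estimator concrete, here is a minimal sketch of an unbiased estimate of tr(f(A)) via a randomly truncated Chebyshev series combined with a Hutchinson probe vector. It assumes the eigenvalues of A lie in [-1, 1], and the geometric truncation distribution `q` is an illustrative choice, not the optimal distribution derived in the paper; all function names are my own.

```python
import numpy as np

def chebyshev_coeffs(f, degree):
    """Chebyshev interpolation coefficients of f on [-1, 1]."""
    k = np.arange(degree + 1)
    nodes = np.cos(np.pi * (k + 0.5) / (degree + 1))  # Chebyshev nodes
    fx = f(nodes)
    c = np.array([2.0 / (degree + 1)
                  * np.sum(fx * np.cos(np.pi * j * (k + 0.5) / (degree + 1)))
                  for j in range(degree + 1)])
    c[0] /= 2.0
    return c

def unbiased_trace_estimate(A, f, max_degree=30, rng=None):
    """One-sample unbiased estimate of tr(f(A)) for symmetric A with
    eigenvalues in [-1, 1] (assumption), using a randomly truncated
    Chebyshev series and a Rademacher (Hutchinson) probe vector.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = A.shape[0]
    c = chebyshev_coeffs(f, max_degree)
    # Illustrative geometric truncation distribution (not the paper's optimal one).
    q = 0.5 ** np.arange(max_degree + 1)
    q /= q.sum()
    tail = np.array([q[n:].sum() for n in range(max_degree + 1)])  # P(N >= n)
    N = rng.choice(max_degree + 1, p=q)   # sampled truncation degree
    v = rng.choice([-1.0, 1.0], size=d)   # Rademacher probe vector
    # Reweight each retained term by 1 / P(N >= n) so the estimate is unbiased:
    # E[sum_{n<=N} c_n v^T T_n(A) v / P(N >= n)] = sum_n c_n tr(T_n(A)).
    est = c[0] * (v @ v) / tail[0]        # T_0(A) v = v
    if N >= 1:
        w_prev, w_curr = v, A @ v         # T_1(A) v = A v
        est += c[1] * (v @ w_curr) / tail[1]
        for n in range(2, N + 1):         # T_n = 2 A T_{n-1} - T_{n-2}
            w_prev, w_curr = w_curr, 2.0 * (A @ w_curr) - w_prev
            est += c[n] * (v @ w_curr) / tail[n]
    return est
```

Averaging many such one-sample estimates converges to tr(f(A)) up to the (tiny, for smooth f) degree-30 truncation error; the choice of q trades variance against the expected number of matrix-vector products per sample.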
Stochastic Chebyshev Gradient Descent for Spectral Optimization
Insu Han, Haim Avron, Jinwoo Shin
A large class of machine learning techniques requires the solution of optimization problems involving spectral functions of parametric matrices, e.g. the log-determinant or the nuclear norm. Unfortunately, computing the gradient of a spectral function is generally of cubic complexity, so gradient descent methods are rather expensive for optimizing objectives involving spectral functions. Thus, one naturally turns to stochastic gradient methods in the hope that they will provide a way to reduce or altogether avoid the computation of full gradients. However, a new challenge appears here: there is no straightforward way to compute unbiased stochastic gradients for spectral functions. In this paper, we develop unbiased stochastic gradients for spectral-sums, an important subclass of spectral functions.
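The cubic cost the abstract refers to comes from the eigendecomposition needed for an exact gradient. A minimal sketch, using the standard identity d/d\theta tr(f(A(\theta))) = tr(f'(A) \partial A/\partial\theta) for symmetric A (the function name is illustrative):

```python
import numpy as np

def exact_spectral_gradient(A, dA_dtheta, fprime):
    """Exact derivative of tr(f(A(theta))) w.r.t. one scalar parameter,
    for symmetric A: d/dtheta tr(f(A)) = tr(f'(A) dA/dtheta).
    The full eigendecomposition makes this O(d^3) per gradient.
    """
    lam, U = np.linalg.eigh(A)             # O(d^3) eigendecomposition
    fprime_A = (U * fprime(lam)) @ U.T     # f'(A) = U diag(f'(lam)) U^T
    return np.trace(fprime_A @ dA_dtheta)
```

This is the per-parameter cost that the paper's randomized Chebyshev estimator avoids: the stochastic estimator only needs matrix-vector products with A, never an eigendecomposition.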