Information-Theoretic Lower Bounds for Zero-Order Stochastic Gradient Estimation

Alabdulkareem, Abdulrahman, Honorio, Jean

arXiv.org Machine Learning 

In this paper, we analyze the number of samples necessary to estimate the gradient of any multidimensional smooth (possibly non-convex) function in a zero-order stochastic oracle model. In this model, an estimator has access only to noisy evaluations of the function, from which it must produce an estimate of the gradient. We also analyze the number of samples sufficient for the finite difference method, a classical technique in numerical analysis. For $T$ samples and $d$ dimensions, our information-theoretic lower bound is $\Omega(\sqrt{d/T})$. We show that the finite difference method achieves rate $O(d^{4/3}/\sqrt{T})$ for functions with zero third- and higher-order derivatives. Thus, the finite difference method is not minimax optimal, and there is therefore room for the development of better gradient estimation methods.
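The zero-order setting described above can be illustrated with a minimal sketch of the classical finite difference estimator, averaging repeated noisy oracle calls to reduce variance. This is a generic illustration under an assumed Gaussian-noise oracle, not the estimator or analysis from the paper; the function names (`fd_gradient`, `oracle`) and the step size and sample counts are illustrative choices.

```python
import numpy as np

def fd_gradient(f_noisy, x, delta=1e-2, samples_per_dim=100, seed=None):
    """Forward finite difference estimate of the gradient of f at x,
    averaging noisy zero-order oracle calls along each coordinate."""
    rng = np.random.default_rng(seed)
    d = x.size
    grad = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = delta
        # Average repeated noisy difference quotients to reduce variance.
        diffs = [(f_noisy(x + e) - f_noisy(x)) / delta
                 for _ in range(samples_per_dim)]
        grad[i] = np.mean(diffs)
    return grad

# Hypothetical noisy oracle for f(x) = ||x||^2, whose true gradient is 2x.
rng = np.random.default_rng(0)
oracle = lambda x: float(x @ x) + rng.normal(scale=0.01)

x0 = np.array([1.0, -2.0, 0.5])
est = fd_gradient(oracle, x0)
```

With noise of standard deviation $\sigma$, each difference quotient has error on the order of $\sigma/\delta$, which is why the total oracle budget $T = d \cdot \text{samples\_per\_dim}$ governs the achievable accuracy, matching the $T$-dependence discussed in the abstract.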
