Finite-Sample Symmetric Mean Estimation with Fisher Information Rate
Shivam Gupta, Jasper C. H. Lee, Eric Price
arXiv.org Artificial Intelligence
The mean of an unknown variance-$\sigma^2$ distribution $f$ can be estimated from $n$ samples with variance $\frac{\sigma^2}{n}$ and a nearly matching subgaussian rate. When $f$ is known up to translation, this can be improved asymptotically to $\frac{1}{n\mathcal I}$, where $\mathcal I$ is the Fisher information of the distribution. Such an improvement is not possible for general unknown $f$, but [Stone, 1975] showed that this asymptotic convergence $\textit{is}$ possible if $f$ is $\textit{symmetric}$ about its mean. Stone's bound is asymptotic, however: the $n$ required for convergence depends in an unspecified way on the distribution $f$ and the failure probability $\delta$. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every $f, n, \delta$ with $n > \log \frac{1}{\delta}$, we get convergence close to a subgaussian with variance $\frac{1}{n \mathcal I_r}$, where $\mathcal I_r$ is the $r$-$\textit{smoothed}$ Fisher information with smoothing radius $r$ that decays polynomially in $n$. Such a bound essentially matches the finite-sample guarantees in the known-$f$ setting.
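The gap between the $\frac{\sigma^2}{n}$ rate of the sample mean and the $\frac{1}{n\mathcal I}$ Fisher-information rate can be seen concretely in a small simulation. The sketch below (not from the paper; the distribution and parameters are chosen purely for illustration) uses the Laplace distribution, which is symmetric and has $\sigma^2 = 2b^2$ but Fisher information $\mathcal I = 1/b^2$, so the location MLE — the sample median — should have roughly half the variance of the sample mean.

```python
# Illustrative sketch, assuming a Laplace(mu, b) distribution: the sample
# median (the location MLE) attains variance ~ 1/(n*I) = b^2/n, while the
# sample mean only attains sigma^2/n = 2*b^2/n.
import numpy as np

rng = np.random.default_rng(0)
mu, b = 0.0, 1.0          # location and scale of the Laplace distribution
n, trials = 500, 2000     # samples per trial; number of Monte Carlo trials

samples = rng.laplace(mu, b, size=(trials, n))
var_mean = np.var(samples.mean(axis=1))          # ~ sigma^2/n = 2*b^2/n
var_median = np.var(np.median(samples, axis=1))  # ~ 1/(n*I)   =   b^2/n

ratio = var_mean / var_median  # should be close to 2
```

With these (hypothetical) parameters the empirical variance ratio comes out near 2, matching the asymptotic prediction; the paper's contribution is a finite-sample version of this phenomenon for unknown symmetric $f$, via the smoothed Fisher information $\mathcal I_r$.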
Jun-28-2023