Mean and Variance Estimation Complexity in Arbitrary Distributions via Wasserstein Minimization
Valentio Iverson and Stephen Vavasis
arXiv.org Artificial Intelligence
Parameter estimation is a fundamental challenge in machine learning, crucial for tasks such as neural network weight fitting and Bayesian inference. This paper focuses on the complexity of estimating a translation parameter $\boldsymbol{\mu} \in \mathbb{R}^l$ and a shrinkage parameter $\sigma \in \mathbb{R}_{++}$ for a distribution of the form $\frac{1}{\sigma^l} f_0 \left( \frac{\boldsymbol{x} - \boldsymbol{\mu}}{\sigma} \right)$, where $f_0$ is a known density on $\mathbb{R}^l$, given $n$ samples. We highlight that while the problem is NP-hard under Maximum Likelihood Estimation (MLE), an $\varepsilon$-approximation can be obtained for arbitrary $\varepsilon > 0$ in $\text{poly} \left( \frac{1}{\varepsilon} \right)$ time by minimizing the Wasserstein distance instead.
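To make the setting concrete, here is a minimal one-dimensional sketch of Wasserstein-based location-scale estimation, under assumptions not taken from the paper: $f_0$ is chosen to be the standard normal density (the paper allows arbitrary known $f_0$ on $\mathbb{R}^l$), and the squared 2-Wasserstein distance is used, which in one dimension reduces to a least-squares fit of sorted samples against reference quantiles of $f_0$.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical setup: draw samples from mu + sigma * f_0 with f_0 = N(0, 1).
rng = np.random.default_rng(0)
mu_true, sigma_true = 3.0, 2.0
x = rng.normal(mu_true, sigma_true, size=5000)

# Sorted samples approximate the empirical quantile function.
xs = np.sort(x)
n = len(xs)

# Quantiles of the reference density f_0 at plotting positions.
p = (np.arange(1, n + 1) - 0.5) / n
q = np.array([NormalDist().inv_cdf(t) for t in p])

# In 1-D, minimizing the squared 2-Wasserstein distance between the
# empirical distribution and mu + sigma * f_0 over (mu, sigma) is an
# ordinary least-squares fit of sorted samples on reference quantiles:
# the slope estimates sigma, the intercept estimates mu.
sigma_hat = np.cov(q, xs, bias=True)[0, 1] / np.var(q)
mu_hat = xs.mean() - sigma_hat * q.mean()
print(mu_hat, sigma_hat)
```

This closed-form reduction is specific to one dimension; the paper's contribution concerns the multivariate case, where no such quantile-matching shortcut exists and the polynomial-time guarantee is nontrivial.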
Jan-17-2025