Approximating $f$-Divergences with Rank Statistics
Stein, Viktor, de Frutos, José Manuel
We introduce a rank-statistic approximation of $f$-divergences that avoids explicit density-ratio estimation by working directly with the distribution of ranks. For a resolution parameter $K$, we map the mismatch between two univariate distributions $\mu$ and $\nu$ to a rank histogram on $\{0, \ldots, K\}$ and measure its deviation from uniformity via a discrete $f$-divergence, yielding a rank-statistic divergence estimator. We prove that the resulting divergence is nondecreasing in $K$ and always lower-bounds the true $f$-divergence, and we establish quantitative convergence rates as $K \to \infty$ under mild regularity of the quantile-domain density ratio. To handle high-dimensional data, we define the sliced rank-statistic $f$-divergence by averaging the univariate construction over random projections, and we provide convergence results for the sliced limit as well. We also derive finite-sample deviation bounds and asymptotic normality results for the estimator. Finally, we validate the approach empirically, benchmarking against neural baselines and illustrating its use as a learning objective in generative modelling experiments.
Feb-2-2026
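The univariate construction described in the abstract can be sketched in code. This is only an illustrative reading, not the paper's algorithm: the resampling-based ranking, the KL generator $f(t) = t \log t$, and all function and parameter names below are assumptions.

```python
import numpy as np

def rank_histogram_divergence(mu_samples, nu_samples, K, f=None, rng=None):
    """Illustrative rank-statistic f-divergence estimate (assumed construction).

    For each sample y ~ nu, draw K reference samples from mu and record the
    rank of y among them, an integer in {0, ..., K}.  When mu == nu, the rank
    is (approximately) uniform on {0, ..., K}; the discrete f-divergence of
    the empirical rank histogram from the uniform law quantifies the mismatch.
    """
    rng = np.random.default_rng(rng)
    if f is None:
        f = lambda t: t * np.log(t)          # KL generator: f(t) = t log t
    ranks = []
    for y in nu_samples:
        ref = rng.choice(mu_samples, size=K, replace=True)
        ranks.append(int(np.sum(ref < y)))   # rank of y among K draws from mu
    hist = np.bincount(ranks, minlength=K + 1) / len(ranks)
    u = 1.0 / (K + 1)                        # uniform reference mass per bin
    # Discrete f-divergence D_f(hist || uniform) = sum_k u * f(hist_k / u);
    # clipping avoids f(0) issues for empty bins.
    return float(np.sum(u * f(np.clip(hist, 1e-12, None) / u)))
```

Under this reading, the estimate is near zero when both sample sets come from the same distribution and grows as the rank histogram departs from uniformity, consistent with the lower-bound property stated above. A sliced variant would apply the same routine to one-dimensional projections of multivariate samples and average over random directions.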