Tight Bounds on the Binomial CDF, and the Minimum of i.i.d. Binomials, in terms of KL-Divergence

Xiaohan Zhu, Mesrob I. Ohannessian, Nathan Srebro

arXiv.org Machine Learning 

We provide finite-sample upper and lower bounds on the Binomial tail probability which are a direct application of Sanov's theorem. We then use these to obtain high-probability upper and lower bounds on the minimum of i.i.d. Binomials. Both bounds are finite-sample, asymptotically tight, and expressed in terms of the KL-divergence. The purpose of this note is to provide, in a self-contained and concise way, both upper and lower bounds on the Binomial tail and, through that, on the minimum of i.i.d. Binomials.
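For concreteness, here is a minimal sketch of the kind of two-sided, KL-expressed control the abstract describes, assuming the standard Chernoff and method-of-types forms rather than the paper's exact constants: for k/n <= p, P(Bin(n,p) <= k) <= exp(-n KL(k/n || p)), while the single term P(Bin(n,p) = k) >= exp(-n KL(k/n || p)) / (n+1) already lower-bounds the tail. The function names below are illustrative, not from the paper.

import math

def kl_bernoulli(q, p):
    # KL(q || p) between Bernoulli(q) and Bernoulli(p), in nats.
    assert 0 <= q <= 1 and 0 < p < 1
    kl = 0.0
    if q > 0:
        kl += q * math.log(q / p)
    if q < 1:
        kl += (1 - q) * math.log((1 - q) / (1 - p))
    return kl

def binomial_lower_tail_bounds(n, k, p):
    # Two-sided control of P(Bin(n, p) <= k) for k/n <= p:
    #   upper: Chernoff bound  exp(-n KL(k/n || p))
    #   lower: method-of-types bound on the single term P(Bin(n, p) = k),
    #          exp(-n KL(k/n || p)) / (n + 1), which also bounds the tail from below.
    q = k / n
    assert q <= p
    kl = kl_bernoulli(q, p)
    upper = math.exp(-n * kl)
    lower = upper / (n + 1)
    return lower, upper

# Example: P(Bin(100, 0.5) <= 30) is pinned between the two bounds,
# which agree to first order in the exponent (here n * KL ≈ 8.23).
lo, hi = binomial_lower_tail_bounds(100, 30, 0.5)
print(f"{lo:.3e} <= P(Bin(100,0.5) <= 30) <= {hi:.3e}")

Both bounds decay as exp(-n KL(k/n || p)) up to the polynomial factor (n+1), which is what makes them asymptotically tight in the exponent; the note's contribution is a finite-sample statement of this matching pair.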