Concentration inequalities under sub-Gaussian and sub-exponential conditions
Neural Information Processing Systems
We prove analogues of the popular bounded difference inequality (also called McDiarmid's inequality) for functions of independent random variables under sub-Gaussian and sub-exponential conditions. Applied to vector-valued concentration and the method of Rademacher complexities, these inequalities allow an easy extension of uniform convergence results for PCA and linear regression to the case of potentially unbounded input and output variables.
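For reference, the classical bounded difference (McDiarmid) inequality that the paper generalizes can be stated as follows; the constants \(c_i\) and the form below are the standard textbook statement, not taken from this abstract:

```latex
% McDiarmid's bounded difference inequality (standard form).
% Let X_1, ..., X_n be independent random variables and let
% f satisfy the bounded difference condition: for each i,
%   |f(x_1,...,x_i,...,x_n) - f(x_1,...,x_i',...,x_n)| <= c_i
% for all choices of the arguments. Then for all t > 0,
\[
  \Pr\bigl( f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \ge t \bigr)
  \;\le\; \exp\!\left( \frac{-2 t^2}{\sum_{i=1}^{n} c_i^2} \right).
\]
```

The sub-Gaussian and sub-exponential analogues in the paper relax the uniform bounds \(c_i\) to tail conditions on the coordinate-wise differences, which is what permits unbounded inputs and outputs.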