Robust Kernel Density Estimation with Median-of-Means principle

Pierre Humbert, Batiste Le Bars, Ludovic Minvielle, Nicolas Vayatis

arXiv.org Machine Learning 

In recent years, learning in the presence of outliers has become an increasingly important objective in both statistics and machine learning. Indeed, in many situations, training data can be contaminated by undesired samples, which may badly affect the resulting learning task, especially in adversarial settings. Building robust estimators and algorithms that are resilient to outliers is therefore crucial in many learning procedures. In particular, the inference of a probability density function from a contaminated random sample is of major concern. Density estimation methods are mostly divided into parametric and nonparametric techniques. Among the nonparametric family, the Kernel Density Estimator (KDE) is probably the best known and most used for both univariate and multivariate densities [Parzen, 1962; Silverman, 1986; Scott, 2015], but it is also known to be sensitive to datasets contaminated by outliers [Kim and Scott, 2011, 2012; Vandermeulen and Scott, 2014].
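The median-of-means principle from the title can be illustrated with a minimal sketch (not the authors' exact algorithm; the block count, bandwidth, and data below are illustrative assumptions): split the sample into disjoint blocks, build one KDE per block, and take the pointwise median of the block estimates. Since a handful of outliers can fall into at most a handful of blocks, the median over many blocks is driven by outlier-free estimates.

```python
import numpy as np

def gaussian_kde(sample, x, bandwidth):
    """Standard Gaussian KDE evaluated at the points in x."""
    diffs = (x[:, None] - sample[None, :]) / bandwidth
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (
        len(sample) * bandwidth * np.sqrt(2.0 * np.pi)
    )

def mom_kde(sample, x, bandwidth, n_blocks, seed=0):
    """Median-of-means KDE sketch: one KDE per block of a random
    partition, then the pointwise median across blocks."""
    perm = np.random.default_rng(seed).permutation(len(sample))
    blocks = np.array_split(perm, n_blocks)
    estimates = np.stack(
        [gaussian_kde(sample[b], x, bandwidth) for b in blocks]
    )
    return np.median(estimates, axis=0)

# Inliers from N(0, 1), contaminated by 10 outliers placed at 10.
rng = np.random.default_rng(42)
sample = np.concatenate([rng.normal(0.0, 1.0, size=990), np.full(10, 10.0)])

x = np.array([0.0, 10.0])            # evaluate at the mode and at the outliers
plain = gaussian_kde(sample, x, bandwidth=0.5)
robust = mom_kde(sample, x, bandwidth=0.5, n_blocks=25)
```

With 25 blocks, the 10 outliers can contaminate at most 10 blocks, so a strict majority of block estimates assign essentially zero mass near the outlier cluster and the median suppresses the spurious bump that the plain KDE places there, while both estimators agree near the true mode.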
