
Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

Assumption A has two parts: 1) f_con is uniform on the support of f_tar, and 2) the value of f_con outside of supp(f_tar) is bounded above by its value on supp(f_tar). It seems that both of these are important to the method.
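Written out, the two parts say there is a constant $c > 0$ such that (this symbolic form is my reconstruction of the review's wording, not a quotation from the paper):

$$
f_{\mathrm{con}}(x) = c \;\; \text{for all } x \in \mathrm{supp}(f_{\mathrm{tar}}), \qquad f_{\mathrm{con}}(x) \le c \;\; \text{for all } x \notin \mathrm{supp}(f_{\mathrm{tar}}).
$$

Under the usual contamination model $f_{\mathrm{obs}} = (1-\epsilon) f_{\mathrm{tar}} + \epsilon f_{\mathrm{con}}$ (the mixture notation is mine), part 1 makes the contamination a flat pedestal on the target's support, and part 2 keeps any contamination outside that support below the pedestal.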


Robust Kernel Density Estimation with Median-of-Means principle

Humbert, Pierre, Bars, Batiste Le, Minvielle, Ludovic, Vayatis, Nicolas

arXiv.org Machine Learning

Over the past years, the task of learning in the presence of outliers has become an increasingly important objective in both statistics and machine learning. Indeed, in many situations, training data can be contaminated by undesired samples, which may badly affect the resulting learning task, especially in adversarial settings. Building robust estimators and algorithms that are resilient to outliers is therefore becoming crucial in many learning procedures. In particular, the inference of a probability density function from a contaminated random sample is of major concern. Density estimation methods are mostly divided into parametric and nonparametric techniques. Among the nonparametric family, the Kernel Density Estimator (KDE) is probably the best known and most used for both univariate and multivariate densities [Parzen, 1962; Silverman, 1986; Scott, 2015], but it is also known to be sensitive to datasets contaminated by outliers [Kim and Scott, 2011, 2012; Vandermeulen and Scott, 2014].


Robust Kernel Density Estimation by Scaling and Projection in Hilbert Space

Vandermeulen, Robert A., Scott, Clayton

Neural Information Processing Systems

While robust parameter estimation has been well studied in parametric density estimation, there has been little investigation into robust density estimation in the nonparametric setting. We present a robust version of the popular kernel density estimator (KDE). As with other estimators, a robust version of the KDE is useful since sample contamination is a common issue with datasets. What "robustness" means for a nonparametric density estimate is not straightforward and is a topic we explore in this paper. To construct a robust KDE we scale the traditional KDE and project it to its nearest weighted KDE in the $L^2$ norm. This yields a scaled and projected KDE (SPKDE). Because the squared $L^2$ norm penalizes point-wise errors superlinearly, this causes the weighted KDE to allocate more weight to high density regions. We demonstrate the robustness of the SPKDE with numerical experiments and a consistency result which shows that asymptotically the SPKDE recovers the uncontaminated density under sufficient conditions on the contamination.
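Read as an optimization problem, the scale-and-project step can be written as follows. This display is my transcription of the abstract's description rather than the paper's own notation, and identifying the scaling factor $\beta$ with $1/(1-\epsilon)$ under an $\epsilon$-contamination model is my reading:

$$
w^\star = \operatorname*{arg\,min}_{w \in \Delta_n} \Big\| \sum_{i=1}^n w_i\, k_\sigma(\cdot, X_i) - \beta \hat f_n \Big\|_{L^2},
\qquad
\hat f_n = \frac{1}{n}\sum_{i=1}^n k_\sigma(\cdot, X_i),
$$

where $\Delta_n = \{ w \in \mathbb{R}^n : w_i \ge 0,\ \sum_i w_i = 1 \}$ is the probability simplex and $\beta > 1$ (e.g., $\beta = 1/(1-\epsilon)$). The SPKDE is then the weighted KDE $\sum_i w_i^\star\, k_\sigma(\cdot, X_i)$.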


Robust Kernel Density Estimation by Scaling and Projection in Hilbert Space

Vandermeulen, Robert A., Scott, Clayton D.

arXiv.org Machine Learning

While robust parameter estimation has been well studied in parametric density estimation, there has been little investigation into robust density estimation in the nonparametric setting. We present a robust version of the popular kernel density estimator (KDE). As with other estimators, a robust version of the KDE is useful since sample contamination is a common issue with datasets. What "robustness" means for a nonparametric density estimate is not straightforward and is a topic we explore in this paper. To construct a robust KDE we scale the traditional KDE and project it to its nearest weighted KDE in the $L^2$ norm. This yields a scaled and projected KDE (SPKDE). Because the squared $L^2$ norm penalizes point-wise errors superlinearly, this causes the weighted KDE to allocate more weight to high density regions. We demonstrate the robustness of the SPKDE with numerical experiments and a consistency result which shows that asymptotically the SPKDE recovers the uncontaminated density under sufficient conditions on the contamination.
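Below is a minimal numerical sketch of the construction the abstract describes; it is not the authors' implementation. It assumes a Gaussian kernel, for which the $L^2$ inner product of two kernel components has the closed form $\langle k_\sigma(\cdot, x), k_\sigma(\cdot, y) \rangle_{L^2} = k_{\sigma\sqrt{2}}(x, y)$, and it solves the resulting simplex-constrained quadratic program by projected gradient descent, which is my choice of solver. All function names are hypothetical, and choosing $\beta = 1/(1-\epsilon)$ follows my reading of the contamination model rather than anything stated in the abstract.

import numpy as np

def gaussian_gram_l2(X, sigma):
    # L2 inner products of Gaussian KDE components:
    # <k_sigma(., X_i), k_sigma(., X_j)>_{L2} = k_{sigma*sqrt(2)}(X_i, X_j).
    d = X.shape[1]
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return (4.0 * np.pi * sigma**2) ** (-d / 2.0) * np.exp(-sq / (4.0 * sigma**2))

def project_simplex(v):
    # Euclidean projection onto {w : w_i >= 0, sum_i w_i = 1}.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + theta, 0.0)

def spkde_weights(X, sigma, beta, n_iter=1000):
    # min_{w in simplex} || sum_i w_i k_sigma(., X_i) - beta * KDE ||_{L2}^2
    # expands to w^T G w - (2 beta / n) w^T G 1 + const, a QP over the simplex.
    n = X.shape[0]
    G = gaussian_gram_l2(X, sigma)
    b = (beta / n) * G.sum(axis=1)
    w = np.full(n, 1.0 / n)                    # start from the ordinary KDE
    step = 1.0 / (2.0 * np.linalg.norm(G, 2))  # 1 / Lipschitz const. of gradient
    for _ in range(n_iter):
        w = project_simplex(w - step * 2.0 * (G @ w - b))
    return w

def weighted_kde(x, X, w, sigma):
    # Evaluate sum_i w_i k_sigma(x, X_i) at query points x.
    d = X.shape[1]
    sq = np.sum((x[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = (2.0 * np.pi * sigma**2) ** (-d / 2.0) * np.exp(-sq / (2.0 * sigma**2))
    return K @ w

# Toy usage: 80% N(0,1) target, 20% uniform contamination.
rng = np.random.default_rng(0)
eps = 0.2
X = np.concatenate([rng.normal(0.0, 1.0, size=(80, 1)),
                    rng.uniform(-10.0, 10.0, size=(20, 1))])
w = spkde_weights(X, sigma=0.5, beta=1.0 / (1.0 - eps))

With beta = 1.0 the minimizer is the uniform weight vector (the ordinary KDE); with beta > 1 the simplex constraint forces the projection to shed weight from isolated, low-density sample points and concentrate it in high-density regions, which is the effect the abstract attributes to the superlinear $L^2$ penalty.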