Regularized Loss Minimizers with Local Data Perturbation: Consistency and Data Irrecoverability

Li, Zitao, Honorio, Jean

arXiv.org Machine Learning 

We show that several regularized loss minimization problems can be solved on locally perturbed data with theoretical guarantees of generalization, i.e., loss consistency. Our results quantitatively connect the convergence rates of these learning problems to the impossibility of any adversary recovering the original data from the perturbed observations. To this end, we introduce a new concept of data irrecoverability, and show that the well-studied concept of data privacy implies data irrecoverability.
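The setting the abstract describes can be sketched as follows: each data holder locally perturbs their features with additive noise before release, and the learner then solves a regularized loss minimization problem on the perturbed data. This is a minimal illustrative sketch, not the paper's method; the choice of Laplace noise, the noise scale `b`, the logistic loss, and all constants below are hypothetical assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic binary classification data (not from the paper).
n, d = 500, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.where(X @ w_true + 0.1 * rng.normal(size=n) > 0, 1.0, -1.0)

# Local perturbation: each holder adds Laplace noise to their features
# before release; the scale b is an illustrative choice.
b = 0.5
X_pert = X + rng.laplace(scale=b, size=X.shape)

def reg_logistic_loss_grad(w, X, y, lam):
    """Gradient of L2-regularized logistic loss, one example of a
    regularized loss minimization problem."""
    z = -y * (X @ w)                       # margins
    sigma = 1.0 / (1.0 + np.exp(-z))       # sigmoid(-y * x.w)
    return -(X * (y * sigma)[:, None]).mean(axis=0) + lam * w

# Minimize the regularized loss on the *perturbed* data by gradient descent.
w = np.zeros(d)
lam, lr = 0.1, 0.5
for _ in range(500):
    w -= lr * reg_logistic_loss_grad(w, X_pert, y, lam)

# Loss consistency would mean the perturbed-data minimizer performs well
# on clean data as n grows; here we only do a qualitative sanity check.
acc = np.mean(np.sign(X @ w) == y)
print(f"accuracy of perturbed-data model on clean data: {acc:.2f}")
```

Under this kind of scheme, the adversary only ever sees `X_pert`; the paper's data-irrecoverability notion concerns the impossibility of reconstructing `X` from such perturbed observations.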
