DOPPLER: Differentially Private Optimizers with Low-pass Filter for Privacy Noise Reduction
Xinwei Zhang (University of Southern California), Zhiqi Bu
Neural Information Processing Systems
Privacy is a growing concern in modern deep-learning systems and applications. Differentially private (DP) training prevents trained machine learning models from leaking sensitive information contained in the collected training data. DP optimizers, including DP stochastic gradient descent (DPSGD) and its variants, privatize the training procedure through gradient clipping and DP noise injection. In practice, however, models trained with DPSGD and its variants often suffer significant performance degradation, which prevents the application of DP optimization to many key tasks, such as foundation model pre-training.
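The abstract describes the two mechanisms at the core of DPSGD: per-sample gradient clipping and Gaussian noise injection. The following is a minimal NumPy sketch of a single DPSGD step under standard assumptions (the function name `dpsgd_step` and all parameter defaults are illustrative, not from the paper):

```python
import numpy as np

def dpsgd_step(params, per_sample_grads, lr=0.1, clip_norm=1.0,
               noise_mult=1.0, rng=None):
    """One illustrative DPSGD step: clip each per-sample gradient to
    `clip_norm`, average, add calibrated Gaussian noise, then descend."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Clip each per-sample gradient so its L2 norm is at most clip_norm.
    clipped = [g * min(1.0, clip_norm / max(np.linalg.norm(g), 1e-12))
               for g in per_sample_grads]
    avg = np.mean(clipped, axis=0)
    # Gaussian noise scaled by the sensitivity (clip_norm / batch size).
    noise = rng.normal(0.0, noise_mult * clip_norm / len(per_sample_grads),
                       size=avg.shape)
    return params - lr * (avg + noise)
```

With `noise_mult=0.0` the step reduces to SGD on clipped gradients; the added noise is what provides the formal DP guarantee, and it is also the source of the performance degradation the paper targets.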