Differentially Private Block-wise Gradient Shuffle for Deep Learning
arXiv.org Artificial Intelligence
Traditional Differentially Private Stochastic Gradient Descent (DP-SGD) ensures privacy by adding noise drawn from a Gaussian distribution to the gradients. This paper introduces the novel Differentially Private Block-wise Gradient Shuffle (DP-BloGS) algorithm for deep learning. DP-BloGS builds on the existing private deep learning literature but makes a definitive shift by taking a probabilistic approach to gradient noise: noise is introduced through shuffling, modeled after information-theoretic privacy analyses. The theoretical results presented in this paper show that the combination of shuffling, parameter-specific block size selection, batch layer clipping, and gradient accumulation allows DP-BloGS to achieve training times close to those of non-private training while maintaining privacy and utility guarantees similar to those of DP-SGD. DP-BloGS is also found to be significantly more resistant to data extraction attempts than DP-SGD. The experimental findings validate the theoretical results.
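The core mechanism the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the per-layer clipping, and the choice to shuffle only whole blocks (leaving any remainder in place) are assumptions made for the sake of a runnable example.

```python
import numpy as np

def blockwise_gradient_shuffle(grad, block_size, clip_norm, rng=None):
    """Hypothetical sketch of block-wise gradient shuffling:
    clip a layer's gradient to bound its norm, split the flattened
    gradient into fixed-size blocks, and randomly permute the blocks."""
    rng = np.random.default_rng() if rng is None else rng
    flat = grad.ravel()
    # Layer-level clipping: rescale so the gradient norm is at most clip_norm.
    norm = np.linalg.norm(flat)
    if norm > clip_norm:
        flat = flat * (clip_norm / norm)
    # Shuffle only the full blocks; any trailing remainder stays in place.
    n_full = (flat.size // block_size) * block_size
    blocks = flat[:n_full].reshape(-1, block_size)
    perm = rng.permutation(blocks.shape[0])
    shuffled = np.concatenate([blocks[perm].ravel(), flat[n_full:]])
    return shuffled.reshape(grad.shape)
```

Note that, unlike additive Gaussian noise, the permutation preserves the multiset of gradient values and the clipped norm; the randomness lies entirely in which block each value lands in, which is the probabilistic noise source the abstract refers to.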
Jul-31-2024