Differentially Private Learning Needs Hidden State (Or Much Faster Convergence)

Neural Information Processing Systems 

The mainstream analysis of the privacy loss of DP-SGD relies on composition theorems, which bound the total privacy loss of the training process accumulated across all of its iterations.
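As a concrete illustration of composition-based accounting, the sketch below applies the advanced composition theorem (Dwork, Rothblum, and Vadhan, 2010) to bound the total privacy loss of a run of many identical (ε, δ)-DP steps, as in a DP-SGD training loop. The function name and parameter choices are illustrative, not from the paper; real DP-SGD accountants use tighter analyses (e.g., the moments accountant or Rényi DP).

```python
import math

def advanced_composition(eps: float, delta: float,
                         steps: int, delta_prime: float):
    """Upper-bound the total (eps_total, delta_total)-DP guarantee
    after `steps` adaptive compositions of an (eps, delta)-DP
    mechanism, via the advanced composition theorem.

    delta_prime is the extra slack delta spent to obtain the
    sqrt(steps) scaling of the epsilon term.
    """
    eps_total = (eps * math.sqrt(2 * steps * math.log(1 / delta_prime))
                 + steps * eps * (math.exp(eps) - 1))
    delta_total = steps * delta + delta_prime
    return eps_total, delta_total

# Example: 1000 training steps, each (0.01, 1e-7)-DP.
eps_total, delta_total = advanced_composition(
    eps=0.01, delta=1e-7, steps=1000, delta_prime=1e-5)
```

Note that ε_total grows roughly as √T in the number of steps T, which is exactly the accumulation that hidden-state analyses aim to avoid.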
