Faster Algorithms for User-Level Private Stochastic Convex Optimization
Neural Information Processing Systems
We study private stochastic convex optimization (SCO) under user-level differential privacy (DP) constraints. In this setting, there are n users (e.g., cell phones), each possessing m data items (e.g., text messages), and we need to protect the privacy of each user's entire collection of data items. Existing algorithms for user-level DP SCO are impractical in many large-scale machine learning scenarios because: (i) they make restrictive assumptions on the smoothness parameter of the loss function and require the number of users to grow polynomially with the dimension of the parameter space; or (ii) they are prohibitively slow, requiring at least (mn)^{3/2} gradient computations for smooth losses and (mn)^3 computations for non-smooth losses. To address these limitations, we provide novel user-level DP algorithms with state-of-the-art excess risk and runtime guarantees, without stringent assumptions. First, we develop a linear-time algorithm with state-of-the-art excess risk (for a non-trivial linear-time algorithm) under a mild smoothness assumption.
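The abstract does not specify the paper's algorithms, but the user-level DP setting it describes can be illustrated with a generic user-level DP-SGD step: each of the n users contributes one gradient averaged over their m items, that per-user contribution is clipped to bound any single user's influence, and Gaussian noise calibrated to the clip norm is added. This sketch is an assumption for illustration only, not the paper's method; all function and parameter names (user_level_dp_sgd_step, clip_norm, noise_multiplier) are hypothetical.

```python
import numpy as np

def user_level_dp_sgd_step(theta, user_batches, grad_fn, lr=0.1,
                           clip_norm=1.0, noise_multiplier=1.0,
                           rng=np.random.default_rng(0)):
    """One generic user-level DP-SGD step (illustrative sketch, not the
    paper's algorithm).

    theta        : current parameter vector
    user_batches : list of n entries, one per user, each with m data items
    grad_fn      : grad_fn(theta, item) -> gradient of the loss on one item
    """
    per_user_grads = []
    for items in user_batches:
        # Each user contributes a single gradient averaged over their m items.
        g = np.mean([grad_fn(theta, x) for x in items], axis=0)
        # Clip the per-user gradient so one user's influence is bounded,
        # which is what "user-level" (vs. item-level) DP requires.
        norm = np.linalg.norm(g)
        g = g * min(1.0, clip_norm / max(norm, 1e-12))
        per_user_grads.append(g)
    n = len(per_user_grads)
    avg = np.mean(per_user_grads, axis=0)
    # The average's sensitivity to swapping one user is clip_norm / n,
    # so Gaussian noise is scaled accordingly (noise_multiplier sets the
    # privacy/utility trade-off).
    noise = rng.normal(0.0, noise_multiplier * clip_norm / n, size=theta.shape)
    return theta - lr * (avg + noise)
```

For example, with a squared loss grad_fn = lambda theta, xy: (theta @ xy[0] - xy[1]) * xy[0], repeated calls to this step run in time linear in the total number of items mn per pass, which is the runtime regime the abstract's linear-time algorithm targets.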