Matching the Statistical Query Lower Bound for k -Sparse Parity Problems with Sign Stochastic Gradient Descent
Neural Information Processing Systems
The k-sparse parity problem is a classical problem in computational complexity and algorithmic theory, serving as a key benchmark for understanding computational classes. In this paper, we solve the k-sparse parity problem with sign stochastic gradient descent, a variant of stochastic gradient descent (SGD), on two-layer fully-connected neural networks. We demonstrate that this approach can efficiently solve the k-sparse parity problem on a d-dimensional hypercube (k \le O(\sqrt{d})) with a sample complexity of \tilde{O}(d^{k-1}) using 2^{\Theta(k)} neurons, matching the established \Omega(d^{k}) lower bounds of Statistical Query (SQ) models. Our theoretical analysis begins by constructing a good neural network capable of correctly solving the k-parity problem. We then demonstrate how a neural network trained with sign SGD can effectively approximate this good network, solving the k-parity problem with small statistical errors.
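As an illustration of the setting (not the paper's construction), the sketch below trains the first layer of a two-layer ReLU network with sign SGD on k-sparse parity data over the d-dimensional hypercube. The width, learning rate, step count, and frozen second layer are all illustrative assumptions, not choices from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, width, lr, steps, batch = 20, 3, 64, 0.05, 2000, 64
S = rng.choice(d, size=k, replace=False)  # hidden set of k relevant coordinates

def sample(n):
    # Uniform inputs on the hypercube {-1, +1}^d; label is the k-sparse parity.
    X = rng.choice([-1.0, 1.0], size=(n, d))
    y = np.prod(X[:, S], axis=1)
    return X, y

# Two-layer network f(x) = a . relu(W x + b); second-layer weights a are frozen.
W = rng.normal(scale=1.0 / np.sqrt(d), size=(width, d))
b = rng.normal(scale=0.1, size=width)
a = rng.choice([-1.0, 1.0], size=width) / width

for _ in range(steps):
    X, y = sample(batch)
    z = X @ W.T + b                  # pre-activations, shape (batch, width)
    h = np.maximum(z, 0.0)           # ReLU
    pred = h @ a
    g = pred - y                     # gradient of 0.5*(pred - y)^2 w.r.t. pred
    dh = np.outer(g, a) * (z > 0)    # backprop through the ReLU
    gW = dh.T @ X / batch
    gb = dh.mean(axis=0)
    # Sign SGD: update with only the sign of each gradient coordinate.
    W -= lr * np.sign(gW)
    b -= lr * np.sign(gb)

X, y = sample(2000)
acc = np.mean(np.sign(np.maximum(X @ W.T + b, 0.0) @ a) == y)
```

The sign update discards gradient magnitudes, which is the defining feature of sign SGD; this toy run makes no claim about matching the paper's sample-complexity guarantee.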
May-27-2025, 16:56:35 GMT