Cryptographic Hardness of Learning Halfspaces with Massart Noise
Neural Information Processing Systems
We study the complexity of PAC learning halfspaces in the presence of Massart noise. In this problem, we are given i.i.d. labeled examples (x, y) \in \mathbb{R}^N \times \{\pm 1\}, where the distribution of x is arbitrary and the label y is a Massart corruption of f(x), for an unknown halfspace f : \mathbb{R}^N \to \{\pm 1\}, with flipping probability \eta(x) \leq \eta < 1/2. The goal of the learner is to compute a hypothesis with small 0-1 error. Our main result is the first computational hardness result for this learning problem. Specifically, assuming the (widely believed) subexponential-time hardness of the Learning with Errors (LWE) problem, we show that no polynomial-time Massart halfspace learner can achieve error better than \Omega(\eta), even if the optimal 0-1 error is small, namely \mathrm{OPT} = 2^{-\log^{c}(N)} for any universal constant c \in (0, 1).
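To make the Massart noise model concrete, the following is a minimal sketch of how such labeled examples could be generated. The Gaussian marginal, the function name, and the example-dependent flip probabilities are illustrative assumptions, not part of the paper; the model only requires that each label be flipped with some probability \eta(x) \leq \eta < 1/2.

```python
import numpy as np

def massart_sample(n, N, w, eta, rng):
    """Sample n Massart-noisy examples for the halfspace f(x) = sign(<w, x>).

    Illustrative sketch: the marginal over x is taken to be Gaussian here,
    though the Massart model allows an arbitrary distribution.
    """
    X = rng.standard_normal((n, N))
    clean = np.sign(X @ w)  # labels of the unknown halfspace f
    # Massart noise: each label is flipped independently with an
    # example-dependent probability eta(x) <= eta < 1/2.
    # Here eta(x) is drawn uniformly from [0, eta] as a stand-in.
    flip_prob = rng.uniform(0.0, eta, size=n)
    flips = rng.random(n) < flip_prob
    y = np.where(flips, -clean, clean)
    return X, y
```

With \eta(x) uniform on [0, \eta], the expected fraction of flipped labels is \eta/2; the hardness result says that no polynomial-time learner can guarantee 0-1 error better than \Omega(\eta) on such data, even when the optimal error \mathrm{OPT} is much smaller.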