Learning Noisy Halfspaces with a Margin: Massart is No Harder than Random
Neural Information Processing Systems
We study the problem of PAC learning $\gamma$-margin halfspaces with Massart noise. We propose a simple proper learning algorithm, the Perspectron, that has sample complexity $\widetilde{O}((\epsilon\gamma)^{-2})$ and achieves classification error at most $\eta + \epsilon$, where $\eta$ is the Massart noise rate. Prior works (DGT19, CKMY20) came with worse sample complexity guarantees (in both $\epsilon$ and $\gamma$) or could only handle random classification noise (DDKWZ23, KITBMV23), a much milder noise assumption. We also show that our results extend to the more challenging setting of learning generalized linear models with a known link function under Massart noise, achieving a similar sample complexity to the halfspace case. This significantly improves upon the prior state-of-the-art in this setting due to CKMY20, who introduced this model.
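The two distributional assumptions invoked in the abstract ($\gamma$-margin and Massart noise) are standard in this literature. The following self-contained LaTeX snippet spells them out using the usual textbook definitions; it is a sketch of the setting, not text taken from the paper itself.

% Standard formalization of the learning setting (usual definitions
% from the Massart-noise literature, not verbatim from the paper).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
\paragraph{Setting.} Let $D$ be a distribution over
$\{x \in \mathbb{R}^d : \|x\|_2 \le 1\} \times \{\pm 1\}$.
\begin{itemize}
  \item \emph{$\gamma$-margin halfspace:} there exists a unit vector
        $w^\ast$ such that $|\langle w^\ast, x\rangle| \ge \gamma$ on
        the support of $D$, and the clean label is
        $y^\ast(x) = \operatorname{sign}(\langle w^\ast, x\rangle)$.
  \item \emph{Massart noise at rate $\eta < 1/2$:} the observed label
        agrees with $y^\ast(x)$ except with probability
        $\eta(x) \le \eta$, where the flip probability $\eta(x)$ may
        depend adversarially on $x$. Random classification noise is
        the special case $\eta(x) \equiv \eta$, which is why Massart
        noise is the strictly harder assumption.
\end{itemize}
The learner draws $\widetilde{O}\!\left((\epsilon\gamma)^{-2}\right)$
samples from $D$ and must output a halfspace with classification
error at most $\eta + \epsilon$.
\end{document}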