Agnostic PAC-Learning of Functions on Analog Neural Nets
Neural Information Processing Systems
There exist a number of negative results ([J], [BR], [KV]) about learning on neural nets in Valiant's model [V] for probably approximately correct learning ("PAC-learning"). These negative results are based on an asymptotic analysis where one lets the number of nodes in the neural net go to infinity. Hence this analysis is less adequate for the investigation of learning on a small fixed neural net. The latter type of learning problem gives rise to a different kind of asymptotic question: Can the true error of the neural net be brought arbitrarily close to that of a neural net with "optimal" weights through sufficiently long training? In this paper we employ some new arguments in order to give a positive answer to this question in Haussler's rather realistic refinement of Valiant's model for PAC-learning ([H], [KSS]). In this more realistic model no a-priori assumptions are required about the "learning target", noise is permitted in the training data, and the inputs and outputs are not restricted to boolean values.
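The setting described above can be illustrated concretely. Below is a minimal, hedged sketch (not the paper's algorithm; all names, network size, and hyperparameters are illustrative) of the agnostic viewpoint: a small *fixed* sigmoid net is trained by gradient descent on noisy real-valued data whose target is not itself a net, and training drives the empirical error down toward that of the best weights for this fixed architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy, non-neural "learning target": no a-priori assumption that it is
# realizable by the net, noise in the labels, real-valued inputs/outputs --
# the three relaxations of Haussler's agnostic model mentioned above.
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Small FIXED architecture: 3 sigmoid hidden units, one linear output.
W1 = rng.standard_normal((1, 3))
b1 = np.zeros(3)
W2 = rng.standard_normal(3)
b2 = 0.0

def forward(X):
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # hidden activations
    return h, h @ W2 + b2                       # prediction per example

def mse():
    _, pred = forward(X)
    return np.mean((pred - y) ** 2)

loss_init = mse()

# Plain gradient descent on the empirical squared error (illustrative:
# "sufficiently long training" on the fixed net).
lr = 0.3
for _ in range(3000):
    h, pred = forward(X)
    err = pred - y                              # (n,)
    dW2 = h.T @ err / n
    db2 = err.mean()
    dh = np.outer(err, W2) * h * (1.0 - h)      # backprop through sigmoid
    dW1 = X.T @ dh / n
    db1 = dh.mean(axis=0)
    W2 -= lr * dW2
    b2 -= lr * db2
    W1 -= lr * dW1
    b1 -= lr * db1

loss_final = mse()
```

After training, `loss_final` is far below `loss_init`; the paper's question is the harder one of whether the *true* (generalization) error of such a fixed net provably approaches that of the optimal weights.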