Generalization Performance of Empirical Risk Minimization on Over-parameterized Deep ReLU Nets
Lin, Shao-Bo, Wang, Yao, Zhou, Ding-Xuan
arXiv.org Artificial Intelligence
In this paper, we study the generalization performance of global minima obtained by implementing empirical risk minimization (ERM) on over-parameterized deep ReLU nets. Using a novel deepening scheme for deep ReLU nets, we rigorously prove that there exist perfect global minima achieving almost optimal generalization error bounds for numerous types of data under mild conditions. Since over-parameterization is crucial for guaranteeing that the global minima of ERM on deep ReLU nets can be realized by the widely used stochastic gradient descent (SGD) algorithm, our results fill a gap between optimization and generalization.
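The existence claim can be illustrated with a minimal sketch (not the paper's deepening construction): when a two-layer ReLU net has far more hidden units than training samples, its random hidden features generically span the sample space, so an interpolating global minimizer of the empirical risk, with essentially zero training error, exists. All names and sizes below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: over-parameterization (width >> sample count) makes
# zero-empirical-risk global minima of a two-layer ReLU net exist.
rng = np.random.default_rng(0)

n, d, width = 20, 3, 2000            # 20 samples in R^3; width 2000 >> n
X = rng.normal(size=(n, d))
y = rng.normal(size=(n, 1))          # arbitrary real-valued targets

W1 = rng.normal(size=(d, width))     # random first layer, kept fixed here
H = np.maximum(X @ W1, 0.0)          # ReLU feature matrix, shape (n, width)

# With width >> n, H has full row rank almost surely, so the last-layer
# least-squares problem min_w ||H w - y||^2 attains (numerically) zero:
# an interpolating global minimizer of the empirical risk.
w2, *_ = np.linalg.lstsq(H, y, rcond=None)
train_mse = float(np.mean((H @ w2 - y) ** 2))
print(train_mse)  # effectively zero (machine precision)
```

This only shows that perfect global minima exist in the over-parameterized regime; the paper's contribution is the much stronger statement that such minima can also achieve almost optimal generalization error.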
Feb-28-2023