Expressivity and Approximation Properties of Deep Neural Networks with ReLU$^k$ Activation
Juncai He, Tong Mao, Jinchao Xu
arXiv.org Artificial Intelligence
In this paper, we investigate the expressivity and approximation properties of deep neural networks employing the ReLU$^k$ activation function for $k \geq 2$. Although deep ReLU networks can approximate polynomials effectively, deep ReLU$^k$ networks can represent higher-degree polynomials exactly. Our first contribution is a comprehensive, constructive proof of polynomial representation by deep ReLU$^k$ networks, which yields explicit upper bounds on the network size and the number of parameters. Consequently, we establish a suboptimal approximation rate for functions in Sobolev spaces as well as for analytic functions. In addition, by studying the power of deep ReLU$^k$ networks to represent shallow networks, we show that deep ReLU$^k$ networks can approximate functions from a range of variation spaces beyond those generated solely by the ReLU$^k$ activation function, demonstrating their adaptability across various variation spaces.
Jan-10-2024
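As an illustrative sketch (not taken from the paper), the exact polynomial representation claimed in the abstract can be seen from two elementary identities: $\mathrm{ReLU}^2(x) + \mathrm{ReLU}^2(-x) = x^2$ for all real $x$, and the polarization identity $xy = \tfrac{1}{2}\big((x+y)^2 - x^2 - y^2\big)$. A few ReLU$^2$ units therefore reproduce squares and products exactly, and composing such blocks in depth yields higher-degree monomials. The helper names below (`relu_k`, `square`, `product`) are our own, not the authors' construction.

```python
import numpy as np

def relu_k(x, k=2):
    """ReLU^k activation: max(0, x)**k."""
    return np.maximum(0.0, x) ** k

def square(x):
    """Exact x^2 from two ReLU^2 units: relu2(x) + relu2(-x) = x^2."""
    return relu_k(x, 2) + relu_k(-x, 2)

def product(x, y):
    """Exact product via the polarization identity:
    xy = ((x + y)^2 - x^2 - y^2) / 2, each square built from ReLU^2 units."""
    return 0.5 * (square(x + y) - square(x) - square(y))

# Composing these blocks in depth gives exact higher-degree monomials,
# e.g. x^4 = square(square(x)).
x = np.linspace(-2.0, 2.0, 5)
assert np.allclose(square(x), x ** 2)
assert np.allclose(product(x, 3.0 - x), x * (3.0 - x))
assert np.allclose(square(square(x)), x ** 4)
```

This is only a toy sketch of the underlying identities; the paper's constructive proof additionally tracks the depth, width, and parameter counts needed for general polynomials.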