Higher Order Approximation Rates for ReLU CNNs in Korobov Spaces
arXiv.org Artificial Intelligence
This paper investigates the $L_p$ approximation error for higher-order Korobov functions using deep convolutional neural networks (CNNs) with ReLU activation. For target functions having a mixed derivative of order $m+1$ in each direction, we improve the classical second-order approximation rate to an $(m+1)$-th-order rate (up to a logarithmic factor) in terms of the depth of the CNNs. The key ingredient in our analysis is an approximate representation of high-order sparse grid basis functions by CNNs. The results suggest that the higher-order expressivity of CNNs does not suffer severely from the curse of dimensionality.
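Sparse grid constructions of the kind the abstract mentions are built from hierarchical hat-function bases, and the standard starting point is that a one-dimensional hat function is representable *exactly* by a single hidden layer of ReLU units. The sketch below verifies that well-known identity numerically; it is illustrative background only, not the paper's construction (the paper's higher-order basis functions are only approximately represented, and the function names here are hypothetical).

```python
import numpy as np

def relu(x):
    """ReLU activation, applied elementwise."""
    return np.maximum(x, 0.0)

def hat_relu(x):
    """Exact three-neuron ReLU representation of the hat function on [0, 1]:
    hat(x) = 2*ReLU(x) - 4*ReLU(x - 1/2) + 2*ReLU(x - 1)."""
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def hat(x):
    """Reference hat function: peak value 1 at x = 1/2, supported on [0, 1]."""
    return np.maximum(0.0, 1.0 - np.abs(2.0 * x - 1.0))

# The two expressions agree everywhere, including outside the support.
x = np.linspace(-0.5, 1.5, 401)
assert np.allclose(hat_relu(x), hat(x))
```

Multivariate sparse grid basis functions are tensor products of such hats, so representing them with CNNs additionally requires (approximate) products of ReLU-computable functions, which is where depth, and the logarithmic factor, enter.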
Jan-20-2025