Response to reviewer 1
–Neural Information Processing Systems
Weakness 1: The result for the neural network is an upper bound which may not be tight. Theorem 4 gives only an upper bound on the approximation error, and a matching lower bound for neural networks is currently missing, though such a bound seems intuitive by a parameter-counting argument.

Response: We agree that this is an important question to address. Experimentally, it is also easy to check that replacing ReLU with a smoothed ReLU does not change the numerical results.
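As a minimal illustration (not taken from the paper) of why swapping ReLU for a smoothed variant should leave numerical results essentially unchanged, the sketch below compares ReLU with a softplus of increasing sharpness beta; the choice of softplus and the beta values are assumptions for this example. The pointwise gap shrinks like log(2)/beta, so for large beta the two activations are numerically indistinguishable.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def smoothed_relu(x, beta):
    # Softplus with sharpness beta: (1/beta) * log(1 + exp(beta * x)).
    # logaddexp is used for numerical stability at large beta * x.
    # Converges pointwise to ReLU as beta -> infinity.
    return np.logaddexp(0.0, beta * x) / beta

x = np.linspace(-5.0, 5.0, 1001)
for beta in (1.0, 10.0, 100.0):
    gap = np.max(np.abs(relu(x) - smoothed_relu(x, beta)))
    # Maximum gap occurs at x = 0 and equals log(2) / beta.
    print(f"beta={beta:6.1f}  max |ReLU - softplus| = {gap:.5f}")
```

The worst-case deviation at beta = 100 is already below 0.007, which supports the claim that the smoothing does not affect the experiments.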
Aug-15-2025, 16:34:09 GMT