Supplementary Material for Rational Neural Networks
Neural Information Processing Systems
Finally, we use the identity \(\mathrm{ReLU}(x) = (|x| + x)/2\), \(x \in \mathbb{R}\), to define a rational approximation to the ReLU function on the interval \([-1, 1]\) as
\[
\tilde{r}(x) = \frac{1}{2}\left(\frac{x\,r(x)}{1+\epsilon} + x\right).
\]
Therefore, we have the following inequalities for \(x \in [-1, 1]\):
\[
|\mathrm{ReLU}(x) - \tilde{r}(x)|
= \frac{1}{2}\left|\,|x| - \frac{x\,r(x)}{1+\epsilon}\,\right|
\le \frac{1}{2(1+\epsilon)}\left(\big|\,|x| - x\,r(x)\,\big| + \epsilon|x|\right)
\le \frac{\epsilon}{1+\epsilon}.
\]
We now show that ReLU neural networks can approximate rational functions. The structure of the proof closely follows [12, Lemma 1.3]. The statement of Theorem 3 comes in two parts, and we prove them separately.
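The error bound above can be checked numerically. The sketch below is not the paper's construction of \(r\); as an assumption, it substitutes Newman's classical rational approximation to \(\mathrm{sign}(x)\) on \([-1,1]\) (so that \(x\,r(x)\approx|x|\)), forms \(\tilde{r}\) as defined above, and verifies \(|\mathrm{ReLU}(x)-\tilde{r}(x)| \le \epsilon/(1+\epsilon)\) on a grid.

```python
import numpy as np

def newman_sign(x, N=16):
    # Newman's rational approximation to sign(x) on [-1, 1] (used here as a
    # stand-in for the rational function r in the text):
    #   s(x) = (p(x) - p(-x)) / (p(x) + p(-x)),
    #   p(x) = prod_{k=0}^{N-1} (x + xi^k),  xi = exp(-1/sqrt(N)).
    # Then x * s(x) approximates |x| with uniform error O(exp(-sqrt(N))).
    xi = np.exp(-1.0 / np.sqrt(N))
    nodes = xi ** np.arange(N)
    p = lambda t: np.prod(t[..., None] + nodes, axis=-1)
    return (p(x) - p(-x)) / (p(x) + p(-x))

xs = np.linspace(-1.0, 1.0, 20001)
s = newman_sign(xs)

# eps: uniform error of x*s(x) as an approximation to |x| on the grid
eps = np.max(np.abs(np.abs(xs) - xs * s))

# Rational approximation to ReLU via the identity ReLU(x) = (|x| + x)/2
r_tilde = 0.5 * (xs * s / (1.0 + eps) + xs)
relu = np.maximum(xs, 0.0)

err = np.max(np.abs(relu - r_tilde))
print(f"eps = {eps:.3e}, max ReLU error = {err:.3e}, "
      f"bound eps/(1+eps) = {eps / (1 + eps):.3e}")
assert err <= eps / (1 + eps) + 1e-12
```

The assertion mirrors the derivation: since \(\big||x| - x\,r(x)\big| \le \epsilon\) and \(|x| \le 1\) on the grid, the pointwise ReLU error cannot exceed \(\epsilon/(1+\epsilon)\).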