Lipschitz neural networks are dense in the set of all Lipschitz functions
This note shows that, under mild assumptions on the activation function, adding a Lipschitz constraint does not inhibit the expressiveness of neural networks. The main result is the following: Theorem 1. Let ϕ be once continuously differentiable and not polynomial, or let ϕ be the ReLU. Then Lipschitz neural networks with activation ϕ are dense in the set of all Lipschitz functions.
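To make the object of the theorem concrete, the following sketch builds a 1-Lipschitz ReLU network by rescaling each weight matrix to have spectral norm at most 1; since ReLU is 1-Lipschitz, the composition is 1-Lipschitz. The helper names (`project_lipschitz`, `lipschitz_mlp`) are illustrative, not from the note, and spectral-norm rescaling is one common way to enforce the constraint, not necessarily the construction used in the proof.

```python
import numpy as np

def project_lipschitz(W):
    # Rescale W so its spectral norm (largest singular value) is <= 1,
    # making the linear map x -> W @ x 1-Lipschitz in the Euclidean norm.
    s = np.linalg.norm(W, 2)
    return W / max(s, 1.0)

def lipschitz_mlp(x, weights):
    # Forward pass of a 1-Lipschitz network: each layer composes a
    # norm-constrained linear map with the 1-Lipschitz ReLU activation,
    # so the whole network is 1-Lipschitz.
    h = x
    for W in weights[:-1]:
        h = np.maximum(project_lipschitz(W) @ h, 0.0)  # ReLU layer
    return project_lipschitz(weights[-1]) @ h

rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 3)),
           rng.standard_normal((8, 8)),
           rng.standard_normal((1, 8))]

# Empirical check of the Lipschitz bound on two random inputs.
x, y = rng.standard_normal(3), rng.standard_normal(3)
fx, fy = lipschitz_mlp(x, weights), lipschitz_mlp(y, weights)
print(np.linalg.norm(fx - fy) <= np.linalg.norm(x - y))
```

The theorem says that, despite this hard constraint on every layer, such networks can still approximate any Lipschitz function.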
September 29, 2020