Lipschitz neural networks are dense in the set of all Lipschitz functions

Eckstein, Stephan

arXiv.org Machine Learning 

This note shows, under mild assumptions on the activation function, that the addition of a Lipschitz constraint does not inhibit the expressiveness of neural networks. The main result is the following: Theorem 1. Let ϕ be once continuously differentiable and not polynomial, or let ϕ be the ReLU.
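
The density statement suggested by the title (for a fixed constant L, L-Lipschitz networks with such an activation can approximate any L-Lipschitz function) can be exercised numerically. Below is a minimal sketch, not taken from the paper, that trains a one-hidden-layer ReLU network while projecting its weight matrices onto a spectral-norm ball after each gradient step, which keeps the network L-Lipschitz throughout training. The target function |x|, the network width, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not from the paper): enforce a Lipschitz bound on a
# one-hidden-layer ReLU network by rescaling its weight matrices so that the
# product of their spectral norms stays below L. Since ReLU is 1-Lipschitz,
# ||W2||_2 * ||W1||_2 <= L guarantees the whole network is L-Lipschitz.
import torch

torch.manual_seed(0)
L = 1.0        # prescribed Lipschitz constant (assumption)
width = 64     # hidden width (assumption)

net = torch.nn.Sequential(
    torch.nn.Linear(1, width),
    torch.nn.ReLU(),
    torch.nn.Linear(width, 1),
)

def project_lipschitz(model, budget):
    """Rescale each linear layer so the product of spectral norms <= budget."""
    linears = [m for m in model if isinstance(m, torch.nn.Linear)]
    per_layer = budget ** (1.0 / len(linears))  # split the budget evenly
    with torch.no_grad():
        for lin in linears:
            sigma = torch.linalg.matrix_norm(lin.weight, ord=2)  # spectral norm
            if sigma > per_layer:
                lin.weight.mul_(per_layer / sigma)

# Approximate the 1-Lipschitz target f(x) = |x| on [-1, 1].
x = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
y = x.abs()

opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for step in range(2000):
    opt.zero_grad()
    loss = torch.mean((net(x) - y) ** 2)
    loss.backward()
    opt.step()
    project_lipschitz(net, L)  # keep the iterate inside the Lipschitz ball

print(f"final sup-norm error on grid: {(net(x) - y).abs().max().item():.4f}")
```

The product-of-spectral-norms bound used here is only a sufficient (and generally conservative) certificate of the Lipschitz constraint; the point of the sketch is that the constrained network can still fit a 1-Lipschitz target well, in line with the density result stated above.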
