The Many Faces of 1-Lipschitz Neural Networks
Louis Béthune, Alberto González-Sanz, Franck Mamalet, Mathieu Serrurier
Lipschitz-constrained models have been used to solve specific deep learning problems, such as estimating the Wasserstein distance in GANs or training neural networks that are robust to adversarial attacks. Despite novel and effective algorithms for building such 1-Lipschitz networks, their usage remains marginal, and they are commonly considered less expressive and less able to fit the data properly than their unconstrained counterparts. The goal of this paper is to demonstrate that, despite being empirically harder to train, 1-Lipschitz neural networks are theoretically better grounded than unconstrained ones when it comes to classification. We recall some results about 1-Lipschitz functions in the scope of deep learning, and we extend and illustrate them to derive general properties for classification. We propose and prove several new properties of 1-Lipschitz neural networks for classification. First, we show that they can fit arbitrarily difficult decision frontiers, making them as expressive as classical networks while additionally providing robustness certificates. We prove that when minimizing the cross-entropy loss, the optimization problem under a Lipschitz constraint is well posed and its solution generalizes well in the limit of large datasets, whereas regular neural networks can diverge even in remarkably simple situations. We then study the link between classification with 1-Lipschitz networks and optimal transport, through regularized versions of the Kantorovich-Rubinstein duality theory. Last, we derive preliminary bounds on their VC dimension.
arXiv.org Artificial Intelligence
Apr-13-2021
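
The robustness certificates mentioned in the abstract follow directly from the Lipschitz property: for a 1-Lipschitz classifier, the margin between the top two logits lower-bounds the l2 distance by which an input must be moved to change the prediction. Below is a minimal sketch of such a certificate; the function name, the sqrt(2) margin normalization, and the example logits are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def certified_radius(logits: np.ndarray, lip_const: float = 1.0) -> float:
    """Lower bound on the l2 perturbation needed to flip the prediction
    of a lip_const-Lipschitz classifier (hypothetical helper).

    If the logit map is lip_const-Lipschitz in l2, the difference of any
    two logits is sqrt(2) * lip_const-Lipschitz, so no perturbation
    smaller than margin / (sqrt(2) * lip_const) can change the argmax.
    """
    top_two = np.sort(logits)[-2:]             # second-best and best logit
    margin = top_two[1] - top_two[0]           # predicted-class margin
    return margin / (np.sqrt(2.0) * lip_const)

# Example with made-up logits from a 1-Lipschitz network:
logits = np.array([3.1, -0.4, 1.2])
print(certified_radius(logits))                # certified l2 radius around x
```

This is why the expressiveness result matters: the certificate is only as large as the margins the constrained network can achieve, so a 1-Lipschitz model that fits the data with large margins yields non-trivial robustness guarantees essentially for free.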
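
The optimal-transport connection rests on the Kantorovich-Rubinstein duality, which expresses the 1-Wasserstein distance as a supremum over exactly the function class that 1-Lipschitz networks parametrize. A standard statement of the duality is reproduced below; the paper works with regularized variants of it.

```latex
% Kantorovich-Rubinstein duality: the 1-Wasserstein distance between
% probability measures \mu and \nu is attained over 1-Lipschitz potentials,
% the class that 1-Lipschitz neural networks approximate.
\[
  W_1(\mu, \nu)
  \;=\; \sup_{\operatorname{Lip}(f) \le 1}
        \mathbb{E}_{x \sim \mu}\!\left[ f(x) \right]
      - \mathbb{E}_{y \sim \nu}\!\left[ f(y) \right]
\]
```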