Lipschitz bound


Improved Scalable Lipschitz Bounds for Deep Neural Networks

Syed, Usman, Hu, Bin

arXiv.org Machine Learning

Computing tight Lipschitz bounds for deep neural networks is crucial for analyzing their robustness and stability, but existing approaches either produce relatively conservative estimates or rely on semidefinite programming (SDP) formulations (namely the LipSDP condition) that face scalability issues. Building upon ECLipsE-Fast, the state-of-the-art Lipschitz bound method that avoids SDP formulations, we derive a new family of improved scalable Lipschitz bounds that can be combined to outperform ECLipsE-Fast. Specifically, we leverage more general parameterizations of feasible points of LipSDP to derive various closed-form Lipschitz bounds, avoiding the use of SDP solvers. In addition, we show that our technique encompasses ECLipsE-Fast as a special case and leads to a much larger class of scalable Lipschitz bounds for deep neural networks. Our empirical study shows that our bounds improve upon ECLipsE-Fast, further advancing the scalability and precision of Lipschitz estimation for large neural networks.
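
As context for what these methods tighten, the sketch below computes the classical baseline Lipschitz upper bound for a feedforward network with 1-Lipschitz activations: the product of the layer-wise spectral norms. This is only the naive bound that LipSDP-style and ECLipsE-Fast-style methods improve upon, not an implementation of either; the layer shapes are hypothetical.

```python
import numpy as np

def naive_lipschitz_bound(weights):
    """Classical layer-wise Lipschitz upper bound for a feedforward
    network with 1-Lipschitz activations (e.g. ReLU): the product of
    the spectral norms of the weight matrices. LipSDP-style methods
    typically give much tighter estimates than this baseline."""
    bound = 1.0
    for W in weights:
        # ord=2 gives the largest singular value (spectral norm).
        bound *= np.linalg.norm(W, ord=2)
    return bound

# Hypothetical 3-layer network: 32 -> 64 -> 64 -> 10.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 32)),
           rng.standard_normal((64, 64)),
           rng.standard_normal((10, 64))]
print(naive_lipschitz_bound(weights))
```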


Review for NeurIPS paper: Lipschitz Bounds and Provably Robust Training by Laplacian Smoothing

Neural Information Processing Systems

Additional Feedback: I am not particularly familiar with existing theory on solving Lipschitz-constrained optimization, but the main theory in this paper looks novel. The paper currently looks unfinished and has some missing details, especially on how the proposed algorithm is applied to MNIST. Also, it is worthwhile to study a few other datasets of varying complexity to demonstrate their "fundamental lower bound" (line 18). The provably robust training scheme needs to be demonstrated on a few small datasets. I believe this paper has the potential to become a very good paper, but it currently looks incomplete for this conference.


Lipschitz Bounds and Provably Robust Training by Laplacian Smoothing

Neural Information Processing Systems

In this work we propose a graph-based learning framework to train models with provable robustness to adversarial perturbations. In contrast to regularization-based approaches, we formulate the adversarially robust learning problem as one of loss minimization with a Lipschitz constraint, and show that the saddle point of the associated Lagrangian is characterized by a Poisson equation with a weighted Laplace operator. Further, the weighting for the Laplace operator is given by the Lagrange multiplier for the Lipschitz constraint, which modulates the sensitivity of the minimizer to perturbations. Our analysis establishes a novel connection between elliptic operators with constraint-enforced weighting and adversarial learning. We also study the complementary problem of improving the robustness of minimizers with a margin on their loss, formulated as a loss-constrained minimization of the Lipschitz constant.
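
The abstract's analysis is stated in terms of a weighted Laplace operator; a common discrete analogue is the weighted graph Laplacian L = D - W. The sketch below (NumPy) assembles such a Laplacian and solves a toy graph Poisson equation L u = f, where the edge weights stand in for the Lagrange-multiplier weighting described above. The graph, right-hand side, and pinned node are hypothetical choices for illustration, not the authors' training scheme.

```python
import numpy as np

def weighted_graph_laplacian(W):
    """Weighted graph Laplacian L = D - W, where W is a symmetric
    non-negative weight matrix and D the diagonal degree matrix."""
    return np.diag(W.sum(axis=1)) - W

# Hypothetical toy graph on 6 nodes with random positive edge weights.
rng = np.random.default_rng(0)
A = np.triu(rng.random((6, 6)), 1)
W = A + A.T                      # symmetric, non-negative, zero diagonal
L = weighted_graph_laplacian(W)
f = rng.standard_normal(6)

# L is positive semidefinite with the constant vector in its null space;
# pinning u[0] = 0 makes the reduced system positive definite for a
# connected graph, so the Poisson equation L u = f can be solved directly.
u = np.zeros(6)
u[1:] = np.linalg.solve(L[1:, 1:], f[1:])
print(u)
```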